You are a walking data repository. When you are outside your home or vehicle — walking down a street, shopping in a store, or attending a public event or meeting — you may lose your personal privacy and cross the line from private person to virtual public figure. You can be filmed or photographed, your image can be transported to a storage silo anywhere in the world, your voice can be recorded and your movements in public logged. This is the world we live in in 2022.
When you go online to make a purchase, a whole new door to your personally identifiable information (PII) opens to others. You will invariably, and voluntarily, provide your name, address, telephone number, email address, and possibly more detailed information about yourself to strangers. Ostensibly, this data remains private between you and the supplier. “Ostensibly” is the key word here, however; you never really know how much of your PII remains legitimately private.
Everything cited above can become data stored in a record about you anywhere in the world, whether you like it or not. An overly dramatic take? Possibly, but it’s up to you to know the risks and act on them.
What information qualifies as personally identifiable information?
According to the United States Department of Labor (DoL), companies may track the PII of their employees, customers, students, patients or other individuals, depending on the industry. PII is defined as information that directly identifies an individual (e.g., name, address, social security number or other identification number or code, telephone number, email address, etc.). It can also mean information by which an agency seeks to identify specific individuals using other data elements, such as a combination of gender, race, date of birth, geographic indicator, and other descriptors.
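To make the definition above concrete, here is a minimal, hypothetical sketch of flagging a few common PII patterns in free text. The regexes and the sample record are invented for illustration; real PII detection requires far more robust tooling and covers many more identifier formats.

```python
import re

# Illustrative patterns only -- these will miss many real-world formats.
PII_PATTERNS = {
    "email": re.compile(r"[\w.+-]+@[\w-]+\.[\w.-]+"),
    "us_ssn": re.compile(r"\b\d{3}-\d{2}-\d{4}\b"),
    "us_phone": re.compile(r"\b\d{3}[-.]\d{3}[-.]\d{4}\b"),
}

def find_pii(text: str) -> dict:
    """Return a mapping of PII category -> matches found in `text`."""
    return {
        label: matches
        for label, pattern in PII_PATTERNS.items()
        if (matches := pattern.findall(text))
    }

# Invented example record:
record = "Contact Jane at jane.doe@example.com or 555-867-5309; SSN 123-45-6789."
print(find_pii(record))
```

Even this toy detector shows why the DoL’s definition matters in practice: a single free-text field can leak several categories of PII at once.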
Whether you want this PII to end up in the hands (or databases) of numerous outsiders is largely, but not entirely, your own decision. The DoL specifically says, “It is the individual user’s responsibility to protect data to which they have access.”
People have long been uncomfortable with the way companies can track their movements online, often collecting credit card numbers, addresses and other critical information. They were terrified of being followed around the web by ads that had clearly been triggered by their online searches, leaving them with constant concerns about identity theft and fraud. This is a direct result of PII falling into the hands of companies looking to capitalize on your movements on the web.
Those concerns have led to the adoption of regulations in the United States and Europe that guarantee internet users some degree of control over their personal data and images — most importantly, the European Union’s 2018 General Data Protection Regulation (GDPR). Those measures, of course, did not end the debate about the use of personal data by companies; they are only a starting point for deeper and more specific laws.
The California Consumer Privacy Act is a prime example, a data privacy law (enacted in 2020) that provides privacy rights to California residents, giving them options about how their PII can be used. There’s also California’s Automated Decisions Systems Accountability Act (still in the legislative process), which aims to end algorithmic bias against groups protected by federal and state anti-discrimination laws.
Privacy, AI regulation running in parallel
Data privacy laws and regulation of data collected for the use of artificial intelligence run parallel through government agencies because they are so intertwined.
Any time a human is involved in an analytics project, bias can be introduced. AI systems that produce biased results have even made headlines. A highly publicized example is Apple’s credit card algorithm, which was accused of discriminating against women and prompted an investigation by the New York Department of Financial Services. Another is the COMPAS (Correctional Offender Management Profiling for Alternative Sanctions) algorithm used in US legal systems to predict the likelihood that a suspect will become a repeat offender. This one in particular has been shown to be wrong several times.
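One common way practitioners quantify the kind of bias described above is the “disparate impact” ratio between a protected group’s approval rate and a reference group’s. The sketch below is hypothetical: the decision data is invented for illustration, and the 0.8 (“four-fifths”) threshold is a rule of thumb drawn from U.S. EEOC guidance, not from the cases mentioned in this article.

```python
def approval_rate(decisions):
    """Fraction of positive (approved) outcomes in a list of 0/1 decisions."""
    return sum(decisions) / len(decisions)

def disparate_impact(protected, reference):
    """Ratio of approval rates; values below ~0.8 often flag potential bias."""
    return approval_rate(protected) / approval_rate(reference)

# Invented example: credit decisions (1 = approved) for two groups.
group_a = [1, 0, 1, 0, 0, 1, 0, 0, 0, 0]  # 30% approved
group_b = [1, 1, 1, 0, 1, 1, 0, 1, 1, 1]  # 80% approved

ratio = disparate_impact(group_a, group_b)
print(f"Disparate impact ratio: {ratio:.2f}")  # 0.30 / 0.80 = 0.38
```

A ratio this far below 0.8 would not prove discrimination on its own, but it is exactly the kind of signal that triggers the regulatory scrutiny described in this article.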
As a result of all this PII collection, the meteoric rise of analytics and machine learning in online applications, and the constant threat of bias in AI algorithms, law enforcement agencies are fielding an increasing number of citizens’ complaints about online fraud. Governments, too, are scrambling to craft the right legislation to curb this criminal activity.
The state of AI regulation
Are there rules for artificial intelligence? Not yet, but they are coming. States can act faster than the federal government on this, which is no surprise. For two years now, the California legislature has been debating and amending the Automated Decision Making Systems Accountability Act, which requires government agencies to use an acquisition method that minimizes the risk of adverse and discriminatory effects from the design and application of computerized decision systems. There’s a possibility it could become law later this year or early next year.
This is just the first wave of new laws and regulations that will impact online businesses and their customers in the coming years. There is ample evidence that stricter regulations are needed to rein in big-tech companies like Google and Amazon, which have become virtual monopolies through the continued use of their users’ PII.
There is no doubt that the ocean of PII is the fuel that analytics use to produce data that can lead to business value. Analytics is the foundation for artificial intelligence that can suggest a strategy correction for a company, warn of an impending problem in the supply chain, or predict where a market is headed in months or years. All of this matters to a company and its investors, not to mention all the employees, partners, contractors, and customers who depend on the company itself.
Bobby Napiltonia is the president of Okera.