
Know the Sesame Score

If you’re reading this from anywhere but China, you probably don’t have a Sesame Score. But if you care at all about internet privacy and security, you should know what it is.

The Chinese government has created a social credit monitoring system to watch over every member of its population. It's called Zhima Credit and is popularly known as Sesame Credit or China's Sesame Score. It was developed by Ant Financial, an affiliate of Alibaba, China's major online marketplace and a competitor to sites like eBay and Amazon.

Like a Loyalty Card

As the Washington Post put it, China's Sesame Score operates like a loyalty card scheme that monitors and measures how trustworthy a person is, based on their credit history and online activities. It tracks whether you are a good citizen or more of a troublemaker.

If you exhibit positive traits, such as always paying your bills on time, while refraining from "untrustworthy" or even illegal actions, your score will be higher. If you spread lies on social media, dodge your taxes or commit traffic violations, your score goes down.

Everyone gets an individualized score. People with high scores are eligible for loans at more favorable rates than those with lower Sesame Scores. They also enjoy better access to car-sharing services and are eligible for free healthcare checkups. Sounds great. But there are downsides, too. The Washington Post cites reports of people with low scores being blacklisted and suddenly unable to purchase plane tickets.

Pros and Cons

Proponents of this system say it discourages bad citizenship and rewards good behavior. In China, 80 percent of citizens approve of the Sesame Score and other social credit efforts, according to the Post. Furthermore, 76 percent of Chinese respondents to one poll said they generally distrust Chinese society; they view a social credit system as useful for combating social ills ranging from con artists to polluters.

But from an outsider's perspective, the Big Brother comparisons are obvious and the risks of abuse and repression glaring. China's electronic surveillance directly affects tourists and foreigners who do business in the country. Forbes reports that "new cybersecurity laws give the Chinese government access to files, contracts, copyrights, business strategies and phone records with no permission asked" of foreign companies. China now requires businesses to gather their internal information and submit it to a database used to judge the "moral codes" of corporations, a rating that can in turn lead to punishment or reward.

Is this a harbinger of things to come in the United States? Powerful, largely anonymous corporations have already compiled profiles on internet users here. Those profiles can determine which ads you see and how long you wait on hold when calling customer service. Big Tech has been tracking all of us for years. In some ways, the Sesame Score isn’t all that foreign.

Reprinted with permission from privacyparent.com


Incognito, uh huh

All the press and the REpress about this $5 billion lawsuit against Google makes me realize how far Google has gone down the rabbit hole. Google's legal team is juggling a few of these billion-dollar suits. Around the world. A normal day's work… Guess where your advertising dollars are going.

What makes me really curious, though, is who chose the name. Seriously. That guy/girl/other hit a home run. If you call your product Incognito, does that mean it's private? The judges are asking the same question but sit safely on the side of 'you didn't clearly disclose…'. And why would they? And Google, of course, says, 'yes, we did, look!'

Besides the hefty number being bandied about, the only real interest in this little spat is that maybe now a few more users will become aware that 'incognito' is not really what they think it is.


Too Much Power in the Hands of Too Few

In the effort to protect our online transactions, have we ceded too much power to too few? Is Sift, whose algorithms protect corporate behemoths from McDonald’s to Twitter, merely a free-market version of China’s intrusive Sesame Score? What is the cost of safety?

Online shopping may be pervasive, but it isn't without risks. Credit card fraud, for example, rises each year and now eclipses $30 billion annually, with the majority of the losses hitting United States consumers. The COVID pandemic has only made things worse: in April 2020, attempted fraudulent transactions increased by 35 percent.

One company is positioning itself as the answer: Sift, a technology security company founded in 2011. Sift's stated goal is to eliminate fraud on the Internet by using machine learning to combat illegal and suspicious activity. Sift claims that the average business partner sees roughly a 50 percent reduction in payment fraud.

One of Sift's most prominent assertions is that good users shouldn't be penalized for the work of criminals. In other words, everyday users shouldn't be forced to authenticate themselves to an extreme degree. Sift claims it can separate the good users from the bad in virtually every circumstance based on their website activity. On paper, that's a great thing, and it may indeed help make the Internet a safer place.
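For a rough sense of the underlying technique, here is a minimal sketch of transaction scoring with a supervised classifier. To be clear, this is not Sift's actual model; the features, training data and model choice are all hypothetical, chosen only to show how "good users versus bad users" can be framed as a machine learning problem.

```python
# Minimal sketch of fraud scoring with a supervised classifier.
# Not Sift's real system: features, data and model are hypothetical.
from sklearn.ensemble import RandomForestClassifier

# Hypothetical per-transaction features:
# [order_amount, account_age_days, failed_logins_last_day, shipping_billing_mismatch]
X_train = [
    [25.0, 900, 0, 0],
    [30.0, 1200, 1, 0],
    [950.0, 2, 7, 1],
    [1200.0, 1, 5, 1],
]
y_train = [0, 0, 1, 1]  # 0 = legitimate, 1 = fraudulent (labeled history)

model = RandomForestClassifier(n_estimators=100, random_state=0)
model.fit(X_train, y_train)

# Score a new transaction; only high-risk ones would trigger extra verification,
# so "good" users sail through without heavy authentication.
new_transaction = [[480.0, 3, 4, 1]]
fraud_probability = model.predict_proba(new_transaction)[0][1]
print(f"Estimated fraud risk: {fraud_probability:.2f}")
```

The privacy question is everything feeding those features: the more behavioral signals a model like this ingests, the better it separates users, and the more it ends up knowing about everyone.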

However, there are some significant privacy concerns at play. If Sift is able to interpret user activity to determine if someone is a regular user or a hacker, how much data is Sift really tracking about Internet users? Where is that data being stored? What happens if that data gets released?

The questions don't end there. Because Sift is so pervasive and works with so many large retailers, it's privy to a lot of information. Sift's machine learning makes ample use of this data, constantly creating and evolving user profiles on the back end. It's not dissimilar to China's Sesame Score, which is essentially an all-encompassing rating that combines credit scores with social factors. That rating can then be used to exclude individuals from certain parts of society. Sift's intentions don't seem nefarious. But who's to say Sift won't one day deem certain user types fraudulent and ban them from making online transactions? Is that too much power for one company?

So far, Sift's work has helped businesses save billions. But the potential for a data organization like Sift to amass too much power is real. A catastrophic data breach, or Sift's machine learning going haywire, would lead to chaos in the world of online transactions.

Reprinted with permission from: PrivacyParent.com


Silver Sparrow Singing for Apple M1 Chips

There has been a lot of concern about an odd virus infecting Apple's newest M1 devices. Viruses normally target Apple products far less often than Windows machines. Red Canary, which discovered Silver Sparrow (why would you name a virus after a close relative of your company?), is still trying to understand its purpose. After infecting more than 30,000 machines, what it does is still a mystery. This article from the BBC suggests Silver Sparrow might just be a proof of concept intended to show that it's not really that difficult to move dangerous code onto a Mac. That seems reasonable…


COPPA: How a 1998 Law (Sort of) Protects Your Kids

COPPA—the Children’s Online Privacy Protection Act—was supposed to help guard our kids’ privacy on the internet. More than two decades after its passage, the results are mixed. Yes, COPPA protects some kids, but it is too narrow in scope and needs to be updated.

Congress enacted the Children's Online Privacy Protection Act (COPPA) in 1998 to give parents control over what kinds of information companies can gather about their kids and to improve security. It's designed to protect children who use the internet by requiring websites to post their privacy policies and to notify parents and obtain their consent before gathering data. Parents are also supposed to be able to access any information collected on their kids for review or to have it deleted.

COPPA applies to commercial websites and apps that are directed at kids under the age of 13, leaving older kids unprotected. Personal information includes details such as the child's first and last name, phone number, Social Security number and even geolocation data (your kid's street and town). The court can levy penalties of up to $43,280 for each violation, according to Medium.

California Steps In

California has a tougher version, called the California Online Privacy Protection Act (CalOPPA). In the Golden State, CalOPPA applies to operators of online services, apps or commercial websites that collect information on individuals living in California. This means that the website itself does not need to be a business registered in the State of California, according to TermsFeed.

CalOPPA dictates that websites indicate what categories of personal information they collect (such as email addresses and shipping addresses), along with any third parties that might receive and process this information for the site.

As an example of repercussions, CalOPPA led to the California Attorney General fining app providers as much as $2,500 each time they failed to provide compliance plans for collecting personally identifiable information.

Strengthening COPPA

Efforts to improve and strengthen COPPA are ongoing, as evidenced by the FTC hosting a recent public workshop called "The Future of the COPPA Rule." As for future updates to legislation like COPPA, the FTC is not required to take action this year; the next mandatory review is scheduled for 2023.

One area parents should be concerned about is whether the FTC should grant exceptions for education technology and for voice-recognition devices such as Google Home and Alexa, as noted in a report from the New York Times. This issue is going to persist as long as people continue to make inquiries by voice. Voice search keeps growing in popularity because it's easy to use, but it also means more personal data can potentially be harvested.

Another concern: the age limit. For the most part, we mark the age of legal adulthood at 18. The fact that COPPA cuts off protection at age 13 is a serious mistake. Is your 14-year-old not a kid?

COPPA protects, but on its own is an inadequate shield. Parents who care about their children’s online safety and privacy must be vigilant and set ground rules for computer, tablet and phone usage. Meantime, call your legislators and tell them you want a stronger, tougher COPPA.

Published with permission from PrivacyParent.


The Right to be Remembered

The Council of the European Union, the European Commission and the European Parliament designed the GDPR (General Data Protection Regulation) in an effort to protect people's personal data and privacy online. Under the GDPR, websites are now required to disclose to users that cookies will be used to track their digital footprint and to allow users to opt out. Most users find the process cumbersome and disregard the option. There is also a provision granting users the right to be forgotten, which allows them to contact companies that collect, analyze and sell their data and request that they be deleted from their databases.

GDPR laws were not designed in the interest of the user but rather in the interest of the Big Tech giants: Google, Facebook, Amazon, Apple and Microsoft. These key content providers and data collectors, along with many others, profit from the acquisition of user data. Google's business model has mastered the art of data collection, with Facebook, Amazon and Apple following in its footsteps. Microsoft, while a slightly smaller player in this arena, should not be overlooked for its influence in helping craft the GDPR.

Along with Big Tech, politicians have used the GDPR as false cover to pretend they are concerned with digital privacy. If politicians really wanted to protect users' digital data from Big Tech, the laws would have been crafted differently. The right to be forgotten has confusing loopholes and is nearly impossible for users to exercise. To be forgotten, a user must make a request to every individual company that collects data and ask to be removed. It's a daunting task because there are so many companies that collect, analyze and sell data, and it's almost impossible to track them all down. Even if you could contact all these companies, they are constantly adding new partners to whom they transfer your data, and the endless cycle continues. This is insanity. If the laws were crafted with users' privacy as the primary concern, they wouldn't be called 'the right to be forgotten' but rather 'the right to be remembered'.

Let's pretend the GDPR gave users the same simple, single-click ability to opt out of being tracked that it currently gives them to opt in. If that were the case, websites would never get any user information! Big Tech companies are not affected by GDPR regulations because people are lazy, don't care or simply do not understand what's at stake. If there were a right to be remembered, users would have a real choice, not an obscure, confusing set of steps buried deep within a privacy policy or user agreement. The problem is that users are "informed," but not in ways they can ever really access. Who reads a privacy policy or user agreement? It's easy to slip consent to collect data into these online documents. It would be much more difficult if every third-party host that wanted to place a tracking cookie on your device had to ask for your consent every time. No one would take the time to write companies to allow themselves to be tracked, and people would have a real chance at protecting their digital information. These laws need to be rethought and recrafted to truly protect digital information.
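To make that asymmetry concrete, here is a minimal sketch of what a genuinely opt-in site could look like. It assumes a hypothetical Flask app and a hypothetical tracker URL; it illustrates the principle rather than any real site's implementation: by default no tracking cookie exists, and the third-party tracker loads only after the user actively says yes.

```python
# Minimal sketch of consent-gated tracking: the default is "no tracking at all".
# The Flask app and tracker URL are hypothetical, for illustration only.
from flask import Flask, request, make_response

app = Flask(__name__)

# Hypothetical third-party tracker; it never loads unless the user opts in.
TRACKER_SNIPPET = '<script src="https://tracker.example.com/t.js"></script>'

@app.route("/")
def index():
    consented = request.cookies.get("tracking_consent") == "yes"
    page = "<h1>Welcome</h1>"
    if consented:
        page += TRACKER_SNIPPET  # the tracker ships only after an explicit "yes"
    return page

@app.route("/consent", methods=["POST"])
def give_consent():
    # The user must take a deliberate action to be tracked;
    # doing nothing means no tracking cookie is ever set.
    resp = make_response("Preference saved")
    resp.set_cookie("tracking_consent", "yes", max_age=60 * 60 * 24 * 365)
    return resp
```

Today the default runs the other way: the trackers load unless you dig through a consent dialog to refuse them, which is exactly the point of a 'right to be remembered'.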


My TV is Spying on Me!

There's a recent article in the New York Times that reads a little nostalgic but makes a good point: there's no room left for smaller players in the consumer electronics space. Even given the innovation happening, smaller players get gobbled up quickly and absorbed into the huge data-crunching consumer giants. Think Fitbit… The other point Shira Ovide makes is that the largest consumer electronics companies have basically become "the Facebook that happens to sell us the screens, too." In short, if they can watch you jog, watch you watch and watch you play, that data is as valuable as any physical electronic device they sell you.


PRISM

The Internet has given us nearly unlimited access to information of all kinds. We can learn about anything that interests us with a few clicks. So can our government, and when its topic of interest is your internet use, that's a concern.

The federal government’s primary tool for monitoring the Internet activity of U.S. citizens is an initiative called PRISM, launched in the years following 9/11 and immediately covered up. Few people knew anything about it until 2013, when whistleblower Edward Snowden exposed it.

PRISM isn't an acronym; it's a codeword that captures the government's attempt to take in vast quantities of information and focus them on specific areas. In particular, PRISM allows the government to obtain information about individual users directly from the Internet's biggest sites, such as Google, Facebook and YouTube. Also included in PRISM's surveillance is mobile data, including location information and text messages.

The government claims that it only does this surveillance to ensure the safety of its citizens, and that PRISM only collects data when there is a given reason to look at the activity of a particular user. However, just as your Echo Dot can hear your conversations when you haven’t specifically said “Alexa”, there’s no guarantee that the government isn’t watching more of your Internet activity than you’re aware of.

Critics of PRISM claim that the program is unconstitutional. The truth is, we don’t really know what the government is doing with our data. The data may simply help the government to identify legitimate issues that might otherwise have gone unnoticed. Or, the government may be compiling data to one day be used against us. Since there is so much grey area surrounding PRISM, which the government has neglected to clarify, many people assume the worst.

What should you, a law-abiding citizen, do about PRISM? Treat your personal data carefully. Protect it. Use safeguards wherever possible, including VPNs, data encryption and private browsing. Be wary of public wifi networks and online storage tools. Above all, be aware of the threats surrounding you, and make choices about your and your family's Internet privacy knowing what's lurking just out of sight.
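As one concrete example of the "data encryption" safeguard mentioned above, here is a minimal sketch using the Python cryptography package's Fernet recipe (symmetric, authenticated encryption) to protect a local file. The file names are hypothetical, and safe key storage is left out for brevity.

```python
# Minimal sketch: encrypt and decrypt a local file with Fernet
# (from the "cryptography" package). File names are hypothetical.
from cryptography.fernet import Fernet

# Generate a key once and keep it somewhere safer than the encrypted file itself.
key = Fernet.generate_key()
fernet = Fernet(key)

# Encrypt a sensitive file before it goes anywhere near someone else's server.
with open("tax_records.pdf", "rb") as f:
    ciphertext = fernet.encrypt(f.read())
with open("tax_records.pdf.enc", "wb") as f:
    f.write(ciphertext)

# Only someone holding the key can recover the original bytes.
with open("tax_records.pdf.enc", "rb") as f:
    plaintext = fernet.decrypt(f.read())
```

The specific library matters less than the habit: anything sensitive that leaves your machine or sits in online storage is far better protected if it's encrypted with a key only you hold.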


Facial Rec Privacy Outrage, Again

Eighteen Co-op food stores decided to test a system targeting shoplifters and violent store visitors, using a product from Facewatch. Where'd they get that name?

In an open letter to the retailer, Privacy International questioned the legality of the technology in stores. It also asked whether information was being shared with the police.

Facial recognition tech has become extremely controversial. Issues of racial profiling come to mind, but this stuff is going to be hard to put back in the box. A decent article from the BBC.


EU Algo Policing

If this were even remotely possible… "The Commission said the guidelines require online platforms to identify the algorithmic parameters that determine ranking and to share them with companies." I can imagine the wittier of the group responding with a single parameter, "result=".
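For a sense of what "algorithmic parameters that determine ranking" could mean in the simplest possible case, here is a toy sketch. Every signal name and weight is hypothetical, and real platform ranking systems are vastly more complex, but it shows the kind of knob the guidelines are asking platforms to disclose.

```python
# Toy sketch of ranking driven by a few named parameters (all hypothetical).
RANKING_PARAMETERS = {
    "text_relevance": 0.6,
    "seller_rating": 0.2,
    "recency": 0.1,
    "paid_promotion": 0.1,
}

def score(item: dict) -> float:
    """Weighted sum of an item's signals; higher scores rank higher."""
    return sum(weight * item.get(signal, 0.0)
               for signal, weight in RANKING_PARAMETERS.items())

listings = [
    {"name": "A", "text_relevance": 0.9, "seller_rating": 0.5, "recency": 0.2, "paid_promotion": 0.0},
    {"name": "B", "text_relevance": 0.6, "seller_rating": 0.9, "recency": 0.8, "paid_promotion": 1.0},
]
print([item["name"] for item in sorted(listings, key=score, reverse=True)])
```

In this toy case, "identifying the parameters" would mean publishing those four weights; in a real system there are thousands of them, many learned rather than hand-set, which is why "result=" is about as much detail as anyone should expect.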

Margrethe Vestager is becoming something of a star at the European Commission. Reuters gives you the basics here.