Editorial: We must be aware of emerging technology’s risks

The Editorial Board

Humanity has reached a point where everyday life resembles a science fiction film.

Mobile phones have become indispensable personal assistants, multitasking as calendars, computers, health trackers and more. Cars, homes and speakers are interactive and personalized, collecting data to better serve their owners. Emerging technology is no longer limited to the extremely wealthy or powerful: It serves us all.

However, anything that serves us all will also affect us all, both positively and negatively. According to a 2017 Pew Research Center study, 92 percent of 18- to 29-year-olds in the United States own a smartphone. So if smartphone use were found to carry some unforeseen harm, 92 percent of Americans in that age range could be affected.

Personal technology holds increasing influence over the world population, which is why both tech companies and the public must serve as watchdogs to prevent potentially devastating results.

Consumers need to be aware of and educated about the consequences of their technological footprint. With the rise of private tech companies that track and use personal data, it is nearly impossible to know whether your personal information is secure or being monetized.

A chilling example is the Uber data breach. In 2017, Uber revealed that the personal data of 57 million riders and drivers had been stolen in a 2016 hack. The most frightening part? Uber had kept the breach secret for about a year, paying the hackers $100,000 to delete the stolen data. Private companies can easily keep the public in the dark about what happens with their data, so consumers must remain hyper-aware of where they put their information online.

Naturally, the higher the reward, the higher the risk, especially where technological advancement is concerned. Modern devices and services depend on collecting data: the more they gather, the more efficient and personalized they can become. But that also means more data is at risk. Whether through ill-intentioned hackers or simply unforeseen consequences, our technological utopia could quickly become a horror story.

“Black Mirror,” a British television series that aired its first season in 2011, explores worst-case scenarios in which such horrors become reality. Each stand-alone episode depicts a different high-tech, near-future dystopia. The final episode of the third season, “Hated in the Nation,” follows a swarm of robotic bees built with government funding to replace pollination after bees go extinct. The government, however, also uses the bees for surveillance. The technology goes horribly awry when a hacker compromises the swarm and uses its facial recognition ability to murder chosen targets.

To some, “Black Mirror” may seem fantastical and far-fetched. Yet many people unknowingly live in an age where these scenarios are a looming possibility. Apple’s iPhone X, launched at the end of last year, is the most dramatic reworking of the smartphone yet. It is sleeker and stronger, as updates usually are, but this model also unlocks with facial recognition. Sound familiar? Facial recognition was commonplace before the iPhone X, as when Facebook suggests tagging a person in a photo, but this feature epitomizes how much privacy people are willing to give up to be part of the newest trend.

That said, it would be unwise to begin aggressively regulating technological developments. Hindering scientists without due cause would be both unpopular and illogical, given the miraculous achievements humans have unlocked with technology. The development of technology is human nature; even fire was once an emerging technology with its own set of dangers.

For the time being, the most palatable option is for consumers themselves to regulate technology through their consumption choices, understanding where they are entering their data and what risks come with it. Members of the public should also hold tech creators accountable for disclosing what is done with that information.

Consumers need to accept that they, too, are responsible for their digital footprint. We cannot remain a faceless mass, waiting unhesitatingly in line for the technology that could well become our undoing.

On the other side, tech companies and developers should be legally required to be transparent about what data is collected and how it is used, and to come forward when problems arise in their systems.

Although transparency might hamper competition and the advertising of certain products, it is the price the market must pay for increased security. Most of all, the public should not wait to have this increasingly important discussion. While “Black Mirror” may have been created to entertain, it is also an urgent warning to look at our own reflection every so often. Now is the time to look in the mirror, no matter how frightened we may be of what we will see.