The post-World War I years were a tumultuous time politically in the U.S., but today few people remember the Teapot Dome Scandal of the Warren G. Harding administration, even though it dominated headlines at the time. In retrospect, the most important changes of that era were technological: Henry Ford’s Model T took over the roads and replaced horses, while electrification brought radios into almost every home.

It’s possible that, regardless of the media’s fixation on the current president, we are living through a similar time in history. Forget about the day-to-day political news. In 100 years will future generations remember whether the president’s speech to Congress was well received? Probably not. But they will almost certainly remember how our society was changed by technologies like drones, self-driving cars, and artificial intelligence.

Leaders in the tech industry started taking these technologies very seriously in the last years of the Obama administration, and the president had to think about how these revolutionary advances should be regulated and controlled. And when Obama wanted expert advice on technology issues, he turned to a brain trust in a little-known but influential agency called the White House Office of Science and Technology Policy, where Princeton computer science professor Ed Felten held the post of deputy U.S. chief technology officer.

Since Felten returned to Princeton from his stint in government, he has resumed his role as a public intellectual on technology topics. On February 13 Felten gave a presentation at Princeton on his 20-month tour of duty at the White House, where he advised the president on computer science issues, and even met with him face-to-face on several occasions.

On Monday and Tuesday, March 20 and 21, Felten will participate in the Princeton-Fung Global Forum in Berlin, Germany, the fourth iteration of an annual event where Princeton showcases its role as an international intellectual powerhouse (see sidebar, page 27). The forum, organized by the Woodrow Wilson School of Public and International Affairs, will also feature “father of the Internet” Vinton Cerf; Tor Project co-developer Roger Dingledine; former European Commission vice president and commissioner for digital economy and society Neelie Kroes; and Microsoft president and chief legal officer Brad Smith.

The forum, “Society 3.0+: Can Liberty Survive the Digital Age?” will focus on cybersecurity and digital communications, including privacy and human rights versus security protection, the vulnerabilities versus efficiencies posed by the Internet of Things, and other topics. These issues are familiar ground for Felten, who has written about many of them on his blog, Freedom to Tinker.

When Felten went to work in the White House, he was not the first Princeton professor to do so. Many graduates and faculty have served in high offices, going back to James Madison. Felten wasn’t even the only Princeton professor to serve in the Obama administration: In 2009 the president tapped economist Cecilia Rouse, now dean of the Woodrow Wilson School, to be a member of the Council of Economic Advisers. George W. Bush named economics professor Harvey Rosen chairman of the Council in 2005.

Growing up, Felten had always been interested in programming as a hobby and used his technical skills to help his father computerize the family plumbing supply business. He was a physics major at the California Institute of Technology, switching to computer science late in his college career.

When Felten graduated from Caltech in 1985, he went to work for an experimental group that was using computers for physics problems. In 1993 he completed his doctorate in computer science and engineering at the University of Washington, joining the Princeton faculty that same year.

Before long, he began studying computer security, becoming interested in the topic when a group of students came to him for help discovering potential security weaknesses in the new Java programming language. He soon became a well-known expert in that field. In 1996 he became the director of the Secure Internet Programming lab at Princeton. In 1998 Felten got involved with the federal government, consulting with the U.S. Department of Justice on the Microsoft antitrust case.

Felten made national headlines in 2001 with a research paper on breaking SDMI, a copy protection scheme being developed by the recording industry, in response to a “hacking challenge” proposed by SDMI. For his troubles, the recording industry threatened him with legal action, drawing him into a court battle from which he ultimately emerged free to publish the research.

Later in the 2000s, Felten was in the news again, this time for making a video showing how hackers could easily take over a voting machine.

Before Felten joined the Obama administration, he was a critic of it. He argued vociferously against mass surveillance by the National Security Agency and even joined the ACLU in a lawsuit against the intelligence agency.

Many regarded Felten as an unconventional choice for the White House post, since he had been critical of some Obama administration policies. Nevertheless, he was entrusted with a key advisory role.

As deputy chief technology officer, Felten reported to CTO Megan Smith, who reported to the president directly. His team was responsible for advising the president and his senior advisors on technological matters. Some of his work was public, and some of it he can’t talk about.

“We were policy advisors,” Felten said. “Our job was to make words, to give advice, and to be instigators.” Their mission was to improve the government’s technological capabilities, to increase the nation’s capacity to build and use technology, and to make sure the president was well informed about the technological aspects of policy decisions.

In his February 13 talk, Felten described what it was like to walk the corridors of power, literally. He shared a tiny office with six other people in the Eisenhower Executive Office Building adjacent to the West Wing. On his first day in the cramped but opulent 19th-century structure, he walked to his office with his eyes to the ground, looking for the fossils that are embedded in the black marble tiles.

Before taking the job, Felten said his conception of what it was like to work in the White House was partly shaped by pop culture portrayals. The political drama “The West Wing” showed an idealistic version of the presidency, in which “earnest, hardworking staffers solve the nation’s problems while talking and walking briskly down hallways,” in Felten’s description. He said the walking down hallways part, at least, was accurate, because it was hard to have meetings in the tiny offices. Overall, the real experience mostly lived up to the positive portrayal of “The West Wing.” “People are incredibly dedicated and competent … They are exactly the kind of people you would hope would be working at the White House,” Felten said.

But more cynical takes on the White House also had elements of truth. Another way of looking at the presidency is through “Veep,” the HBO comedy starring Julia Louis-Dreyfus, where dysfunction and incompetence are the norm. “In real life, dumb little errors do show up,” Felten said. “A typical ‘Veep’ plot point might be that someone has to work with VIPs but didn’t bring socks to work.” (Through a series of misadventures involving exercise and doctors’ appointments, that actually happened to one of Felten’s colleagues.) While the administration wasn’t a comedy of errors, Felten said small mistakes are noticed and amplified, and overworked and sleep-deprived officials are prone to making them.

The darkest interpretation of Washington power comes from the Netflix series “House of Cards,” where ruthless politician Frank Underwood rises to power through murder and Machiavellian political maneuvers. “It’s a version of Washington driven by deception and limitless will to power,” Felten said. “Will to power is not unknown in Washington … unlike in ‘House of Cards,’ I did not see any actual bloodshed.”

Felten said he saw aspects of all three shows. “I might have been a teeny bit diabolical once or twice, but it was for a good cause of course,” he said.

Understandably, Felten was nervous when it came time to meet his new boss. He soon forgot his jitters, but the importance of the work never left his mind. “The overall intensity and pace of the job is one of the things I’m going to remember most,” he said. Felten was impressed by Obama’s sharp intellect and grasp of the issues being discussed. To prepare for his meeting with Felten’s team, Obama took home a large stack of briefing papers to read. Not only did the president read the papers, but he could quote figures from them months later, Felten says.

One of the major policy areas that Felten and his team worked on was self-driving cars. Teslas all over the country are driving themselves down highways in semi-autonomous “autopilot” mode. Fully autonomous Google cars are being tested, and manufacturers are rapidly putting self-driving features onto new car models. Felten says this entire field never came up during Obama’s first term, but by the end of his second it was becoming a pressing matter.

“Major automakers plan to sell fully automated vehicles on a nationwide scale starting in 2021,” Felten said. If cars weren’t enough to think about, robot trucks are also on the horizon, and drones are another transportation issue that popped up suddenly in the 2010s. The government’s transportation policies quickly seemed outdated.

Felten’s team worked with experts at the National Highway Traffic Safety Administration to create a policy for self-driving cars. “The stakes of self-driving vehicles are higher every year,” Felten said. For one thing, they promise a way out of the increasing death toll of road accidents. Every year about 36,000 Americans are killed on the roads, with 1 million people injured. The toll has been rising for the past two years thanks to drivers distracted by cell phones. (U.S. 1, January 25.) The financial cost runs into the hundreds of billions of dollars. “People are terrible at driving,” Felten said. “Machines can do it a lot better.”

For regulators, self-driving cars pose a dilemma. They don’t want to allow a self-driving car on the roads until it’s safe. On the other hand, they don’t want to be so restrictive that they delay the introduction of autonomous vehicles. Furthermore, there is no real way to test the technology except by taking it out on the roads.

Felten played a small role in crafting a federal policy, released in September 2016, that set performance standards for self-driving cars and offered guidelines for how states should regulate them. The policy sought to balance safety standards against manufacturers’ need for flexibility in designing and testing their autonomous cars. It required automakers to share data on their cars but did not restrict the development of driverless technology. The autonomous car policy, Felten said, was “an unrecognized success of the Obama administration.”

Another highlight of Felten’s tenure came after the San Bernardino terrorist attack, when Obama addressed the nation on plans to fight terrorism. Felten was one of many whose input shaped the speech, which touched on technology several times: “And that’s why I will urge high-tech and law enforcement leaders to make it harder for terrorists to use technology to escape from justice,” Obama said.

“I spent many hours working to influence what ended up as just a few words in the speech,” Felten said, “and I was just one of many people working on that.”

Just as autonomous vehicles promise to transform the civilian economy, they are also changing the way wars are fought. The U.S. military is investing heavily in drone technology, counting on supremacy in the realm of unmanned vehicles to overwhelm any future enemy.

The military is preparing for a future in which weapons will have the ability to fire themselves, and in which the decision to kill or not to kill a target could be up to a computer rather than a human. If autonomous cars pose a dilemma, military drones raise a thicket of legal and ethical questions. Felten says the Defense Department’s guidelines for unmanned weapons follow international humanitarian law, which requires distinguishing military targets from civilians and considering the risk to non-combatants when deciding whether to attack a military target.

“The principles of international humanitarian law tell us everything we need to know in principle,” Felten said. “But how do we translate those principles into concrete action?” For example, when is it OK to allow a machine to determine whether a person is a combatant? What if the machine turns out to be better than a human at making that determination?

“These are live questions now,” Felten said. “As automated weapons become more technologically feasible, we need to figure out how to reconcile our goal of having an effective military, a military that protects American interests and values, and how to reconcile that with humanitarian issues.” Future policy will affect how weapons are designed and tested, how personnel are trained, and what the rules of engagement are for troops in the field. “We have some really hard thinking to do here,” Felten said.

In the meantime, adversaries are developing autonomous weapons of their own, without any such ethical concerns. ISIS has turned drones into deadly weapons. One well-publicized example combined an off-the-shelf DJI quadcopter with a grenade fitted with a badminton shuttlecock to make a cheap unmanned bomber. The contraption was shot down outside Mosul in late February, and a reporter snapped pictures of the improvised weapon.

Countering these threats, as well as the deadlier ones posed by nations developing their own drones, will be a large part of future military strategy. “How might we detect, prevent, or disrupt attempts to use high technology for great harm, internationally or domestically?” Felten said. “These decisions are coming, and we have to figure out how to make them. Technical insight has to be key.”

Behind both self-driving cars and autonomous weapons is artificial intelligence. Advances in AI have allowed these technologies to go from sci-fi books to reality, and even more drastic changes are on the horizon for society. While Felten jokes about the not-very-likely “robot apocalypse” scenario where AI becomes smarter than humanity and wipes it out, there are very real concerns about how it could affect society.

AI could have a major effect on the job market. Experts believe that of the three million Americans who work as drivers, two-thirds could be put out of work by self-driving cars. And just as computerization put white-collar paper pushers out of work by automating their jobs, AI could come along and replace workers whose tasks include routine analysis. Even well-educated workers would not be safe. For example, Felten said, AI routines have been developed that are as good as or better than trained radiologists at examining medical images for problems.

Different experts have different estimates on the effects of AI automation on the future workforce. One study predicted 83 percent of jobs that pay $20 an hour or less would be affected by artificial intelligence, 30 percent of midrange jobs, and 4 percent of highly paid jobs. Another study by a different group estimated that AI would affect 44 percent of people who worked in highly automatable jobs, and that jobs requiring a bachelor’s degree or higher would be unaffected.

There is much uncertainty when it comes to the future of AI, Felten says, and all the estimates should be taken with a grain of salt. However, there is a long-term trend toward less participation in the workforce, and Felten points to automation as the likely cause. At its peak in the 1950s, 87 percent of men were working; today the figure has fallen to just over 50 percent and appears to be still declining. Aging and economic dips cannot explain the trend, Felten says. Policy makers will have to come up with ways to deal with further potential job loss from AI, even as the technology raises the possibility of a more productive workforce, with the economic benefits and higher standard of living that go along with it.

Automation has already affected the job market deeply; Felten says it has caused more job loss than free trade has. It’s up to the government to manage the fallout from these technological changes.

“Turning away from the technology is not the right approach,” Felten said. “Our economy can absorb large changes and large movements of workers in the long run, and they can emerge stronger than ever. But we can’t lose sight of the fact that these dislocations can cause real pain for the workers who are affected … forces pushing towards economic inequality will not solve themselves if we do not have the right policies in place.” One possibility that has been floated is a national minimum income. Felten noted that it would not be cheap to provide everyone in the country with a poverty-level subsistence income. In fact, it would double the federal budget.
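Felten’s cost estimate is easy to check with rough arithmetic. The short sketch below is not from his talk; the population, poverty-line, and federal budget figures are approximate 2016 values chosen only for illustration, but they show why a universal poverty-level payment would cost roughly as much as the entire existing federal budget.

# Back-of-the-envelope check on the cost of a national minimum income.
# All figures are approximate 2016 values, used only for illustration.
us_population = 320_000_000         # approximate U.S. population
poverty_level_income = 12_000       # approximate poverty-level income for one person, dollars per year
federal_budget = 4_000_000_000_000  # approximate total federal spending, dollars per year (~$4 trillion)

annual_cost = us_population * poverty_level_income
print(f"Annual cost: ${annual_cost / 1e12:.1f} trillion")                      # about $3.8 trillion
print(f"Ratio to current federal budget: {annual_cost / federal_budget:.2f}")  # about 1.0, i.e. roughly doubling it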

Felten, along with most of the Obama team, will not be making those decisions. With Donald Trump in the White House, a new set of advisors and experts has come in to ponder those thorny issues. Felten said it’s not at all certain that these advisors, despite coming from the opposite party, will take positions contrary to those Felten and his team took. Many of the issues he worked on are non-partisan in nature.

Speaking to a crowd of Princeton computer science students, many of them likely destined for Silicon Valley, Felten urged them to consider the consequences when working on AI technology that could deeply impact society.

“We would like this technology to develop in a way that is sustainable, meaning that it’s developed in a way where people can accept it, where we have been thoughtful about mitigating the negative consequences that it might have, and where people who are not technologists and don’t know about the technology don’t feel so much that the world is getting out of control,” Felten said. “It’s for all of us to do this work.”

Felten online: freedom-to-tinker.com
