Friday, January 19, 2018

Paul's Update Special 1/19




Motherboard has called Ray Kurzweil “a prophet of both techno-doom and techno-salvation.” Allowing a little wiggle room on his timelines, a full 86 percent of the predictions made by the author, inventor, computer scientist, futurist, and director of engineering at Google have come to fruition, including the fall of the Soviet Union, the growth of the internet, and the ability of computers to beat humans at chess.

Kurzweil continues to share his visions for the future, and his latest prediction was made at the most recent SXSW Conference, where he claimed that the Singularity — the moment when technology becomes smarter than humans — will happen by 2045. Sixteen years prior to that, it will be just as smart as us. As he told Futurism, “2029 is the consistent date I have predicted for when an AI will pass a valid Turing test and therefore achieve human levels of intelligence.”

Kurzweil’s vision of the future doesn’t stop at the Singularity. He has also predicted how technologies, such as nanobots and brain-to-computer interfaces like Elon Musk’s Neuralink or Bryan Johnson’s Kernel, will affect our bodies, leading to a possible future in which both our brains and our entire beings are mechanized.

This process could start with science fiction-level leaps in virtual reality (VR) technology. He predicts VR will advance so much that physical workplaces will become a thing of the past. Within a few decades, our commutes could just become a matter of strapping on a headset.

As Inverse points out, this paradigm shift could have some interesting consequences. Without the need for people to live close to work, we could see unprecedented levels of deurbanization. People will no longer need to flock to large cities for work or be tethered to a specific location. Inverse suggests that this decentralization may decrease the opportunity for terrorist attacks. Blockchain technology will continue to bolster decentralization as well.

According to Kurzweil, technology will not only enable us to rethink the modern workplace, it will also give us the ability to replace our biology with more substantial hardware. He predicts that by the early 2030s, we will be able to copy human consciousness onto an electronic medium.

Not all of Kurzweil’s predictions are so drastic, and some seem even more likely to come to fruition. For example, his prediction of truly ubiquitous WiFi is well on its way to becoming reality, especially with Elon Musk’s announcement that he hopes to beam the internet across the globe from space. Kurzweil’s belief that many of the diseases currently plaguing humanity will be eradicated by the 2020s also seems remarkably possible given increasingly frequent medical breakthroughs.

Kurzweil envisions a future that is exciting, daunting, and a little bit terrifying all at once. Time will tell if his impressive batting average will improve or if the future has other plans for humanity.




Having a four-year college degree is generally regarded as a necessity to score a job in tech. But as the number of tech jobs has climbed, far outpacing the number of applicants, companies like IBM have turned to talent with non-traditional educational backgrounds.

With this drastic shortage of tech workers, the company is now focusing on skills-based hiring rather than credentials to fill these roles.

In a USA Today column, the company's CEO Ginni Rometty explains that not all tech jobs require a college degree. As industries transform, she says, "jobs are being created that demand new skills – which in turn requires new approaches to education, training and recruiting."

These "new collar jobs," says Rometty, are becoming harder to fill. In the U.S. alone, there are more than 500,000 open jobs in tech-related sectors, according to the U.S. Department of Labor. A recent study by Code.org reports that as many as 1 million programming jobs will be unfilled by 2020.

To counter this, IBM's CEO says that the company intends to hire 6,000 employees by the end of this year, many of whom will have unconventional backgrounds.

"About 15 percent of the people we hire in the U.S. don't have four-year degrees," IBM's vice president of talent Joanna Daly tells CNBC Make It. In June, the company announced that it would be partnering with community colleges across the U.S. to better prepare more Americans for "new collar career opportunities." For applicants without a bachelor's degree, Daly says she likes to see hands-on experience and enrollment in vocational classes relevant to the industry they're applying to.

"New collar" jobs not only bring in candidates who built skills through coding camps, community colleges or modern career education programs, but they also attract veterans and those reentering the workforce or relaunching their career, says Daly.

Daly adds that it's important for companies to recognize that there are different ways to get jobs in tech and different qualifications. When reviewing applicants for open positions she says to ask, "Do I really need a four-year degree for this?"

In many cases, IBM's CEO says, an open position simply does not require a bachelor's degree.

"At a number of IBM's locations...as many as one-third of employees don't have a four-year degree," Rometty writes in her column. "What matters most is that these employees...have relevant skills, often obtained through vocational training."




The research firm eMarketer estimates that 60.5 million people in the U.S.—a little less than a fifth of the population—will use a digital assistant at least once a month this year, and about 36 million will do so on a speaker-based device like Amazon Echo or Google Home. These devices are most popular among people age 25 to 34, an age group that includes a ton of parents of young children and parents-to-be.

And these techno-helpers are not just going to get more popular; they will also get better at responding to queries and orders, and they’ll sound more humanlike, too. At the same time, young users will become more comfortable and sophisticated with the technology. They’ll request help with homework or control devices around their home.

Interest in digital assistants jibes with some findings in a recent MIT study, where researchers looked at how children ages three to 10 interacted with Alexa, Google Home, a tiny game-playing robot called Cozmo, and a smartphone app called Julie Chatbot. The kids in the study determined that the devices were generally friendly and trustworthy, and they asked a range of questions to get to know the technologies (“Hey Alexa, how old are you?”) and figure out how they worked (“Do you have a phone inside you?”).

Cynthia Breazeal, one of the researchers and director of the Personal Robots Group at MIT’s Media Lab (as well as cofounder and chief scientist of the company developing an AI robot called Jibo), says that it’s not new for children to anthropomorphize technology. But now it’s happening a little differently.

For young kids who can’t yet read, write, or type but can talk a mile a minute, voice-operated assistants could help build social skills and push boundaries—two things that are key to a child’s development. If nuances in the user’s tone can affect how the digital servants respond—which is not that unlikely in the near future—it’s possible that kids who use them will become more adept at communicating with others (be the others humans or robots).

What about older children? Will they get bossy and bratty from the habit of ordering Alexa around? Probably not, says Kaveri Subrahmanyam, a developmental psychologist and chair of child and family studies at California State University, Los Angeles. But she does wonder whether having digital butlers will reduce kids’ ability to do things for themselves. “I don’t think we have to be worried about it or paranoid about it, but I do think it’s something to be watchful for,” she says.

The other researchers I spoke to aren’t too worried either. “There’s this notion that if all this technology was turned off, everything would be great. We’d be interacting all the time, we’d be reading all the time,” Vandewater says. “I just don’t believe that.”

In fact, maybe the opposite can be true. Perhaps growing up with Alexa will actually make technology less distracting, enabling it to, in a sense, fade into the background; we’ll get what we need from it, and then move on with our lives until we come back with another request.
