When I began my career in the 1980s, artificial intelligence (AI) was the stuff of science fiction and science fantasy. We never thought HAL from 2001: A Space Odyssey would actually exist.
Hello, Alexa? Are you there, Siri?
In my first jobs in broadcast journalism we wrote our stories on manual typewriters – we pecked at the noisy keys just like other “knowledge workers” at the time.
For those of us whose work focused heavily on the written word, the introduction of word processing (think Microsoft Word) was true liberation – edits could be made live on screen and typos could be fixed instantly.
Text documents were suddenly searchable in an instant. Hallelujah! When digital video came in the 1990s, it was equally transformative. Images could be rearranged just like paragraphs of text on a computer screen.
Now we are in the midst of the AI revolution. Its magical powers should be sweeping us off our feet as it transforms how we work. But despite the hype, this revolution hasn’t yet been as transformative as word processors or digital video.
Why hasn’t AI been adopted as quickly as we thought?
A lot of artificial intelligence is scary stuff, automating work that was once done by humans. Robots are replacing people in assembly plants. Self-driving cars grab headlines when they drive into people.
Content on news websites is increasingly curated by AI, not people, and the mistakes made by AI end up in the news. On a more benign level, we’re seeing news stories about quarterly business earnings and sports events produced by bots.
Then there’s the creepy AI that uses facial recognition and combs images on social media to search for a match.
Clearly the advances in AI are far ahead of regulation. As Stephen Hawking said in 2016, “The rise of powerful AI will either be the best or worst thing to happen to humanity.”
The best way to combat AI’s image problem is simple: create AI-powered tools that solve people’s problems. The market may be hesitant, but there’s a lot of opportunity: according to a report on AI by CompTIA, only 29% of businesses currently use AI.
Do people fear AI? Or do they just not know how to use it? The idea of complex AI algorithms might scare some people the same way a course in calculus would, but those people don’t need to understand all the details of how AI works to benefit from it.
Instead, businesses need connectors to the technology – platforms that apply powerful AI in a simple way. When people can see the benefits of applied AI, they can start to see AI as a positive influence in the world.
That’s just part of the story. If more companies used artificial intelligence to liberate people from the menial parts of their work so that they can focus on the meaningful, then maybe we could start to change people’s perception of AI.
As the CEO and founder of an AI-based platform, I often get asked, “What’s it like taking people’s jobs away?” My answer: “I have no idea, because we’re not doing that. Our software helps people do their jobs better.”
Technology has evolved to the point where it can take care of repetitive, monotonous tasks. The most valuable asset to any company is its workforce; AI can (and should) be the liberator that lets people do what they’re best at.
Junk email should be automatically filtered into a Trash folder by a robot, not individually sifted through by a person.
Credit card companies should be alerted when a customer’s card appears to have been stolen, not employ thousands of workers to pore over transaction history and send up a flare when they think a purchase looks fishy.
AI gets rid of the grunt work that humans shouldn’t have to do. Companies that leverage AI are not necessarily trying to take away jobs.
In the content production world, for example, AI tools are allowing people to focus their skills and their time on the things that really matter: producing smart, engaging content.
Customers need to know their content is secure from hackers and won’t be looked at by anyone else. That’s why it’s also important that companies using AI make security a non-negotiable. Privacy concerns haunt companies that deal in the valuable currency of customer data, like Facebook.
These platforms have advertising engines that are fueled by shared posts, tagged photographs and user networks; many people have concerns that those advertising engines are also available to politicians and state-run organizations.
All companies that use artificial intelligence should have the same, simple policy: no one sees your data but you. People don’t want their data scraped by bots without their permission.
Government regulation is overdue as new ways to apply AI are discovered and brought to market. Letting big tech companies set their own standard for data security leaves the decision in the hands of the people who benefit most from increased access to customer data.
It hasn’t worked. Only when people feel secure can AI’s image start to change from something scary and unknown to a new opportunity.
A lot of people are talking about AI today. But in business, the revolution in content production has just begun.
Imagine a world that offers not just instant voice recognition, transcription and translation, but also a menu of instant metadata like sentiment analysis, speaker recognition, image recognition, topic extraction and so much more.
This will push the quality and speed of content production higher and higher.
Artificial intelligence isn’t taking jobs away. It’s giving people their jobs back.
Jeff Kofman, CEO and founder of Trint, is a tech entrepreneur with an unusual backstory. As an Emmy award-winning network television news foreign correspondent and war correspondent with ABC, CBS and CBC News he spent more than three decades reporting from around the world. Jeff has covered many of the biggest stories of our time including the Iraq War, the Arab Spring, Hurricane Katrina, the Gulf Oil Spill and the Chile Mine Rescue. He’s won an Edward R. Murrow Award, a duPont Award and two Emmys, including one for his coverage of the fall of Muammar Gadhafi in Libya in 2011.