The Problem with Videoconferencing
I had my first videoconference experience in about 1999 – which is a century ago in the evolution of most technologies. Today, I use some form of videoconferencing on an almost daily basis. While there have been vast improvements in sound and image quality and in the technical specifications, the overall experience is really not that different. I still feel like I’m watching television. And that’s the problem.
The way I envision it is this: Somewhere in the early stages of the development of two-way video interaction, some scientist looked to the most obvious and miraculous “metaphor” of the time — television. After all, cameras and CRTs already existed. So did microphones, loudspeakers and broadcast transmission technologies. Why not combine television and telephony and create a way to see and hear when you communicate from remote locations? It all made sense. Except that television was – and for the most part remains – a one-way, passive form of communication. In fact, it barely qualifies as communication at all, since there is very little exchange of information. Effective videoconferencing in enterprise applications depends on effective and natural two-way communication, or it falls short of its promise.
I suppose the problem is to a large extent generational. While my perception of television has been one of passive watching and one-way communication, that is changing. After all, we have Wii and Kinect and myriad technologies that allow you to interact with your TV. But in the end, you’re still interacting with a square box. And it’s true that my generation of digital immigrants lags behind my kids’ generation of digital natives in terms of technology lifestyle. But how close can the TV-bound model ever really get to delivering the true promise of videoconferencing and “telepresence” – the experience of “being there” in the conference room? Researchers are continually refining the technologies and getting incrementally closer all the time, but fundamentally, most research continues to use cameras, displays, and audio transmission technologies to simulate being there.
Charles Fort had a better idea. He’s credited with coining the term “teleportation” in 1931 (before the commercialization of television). Later, in 1966, Gene Roddenberry envisioned the idea in the TV series (irony noted) Star Trek. Just last year, researchers at the University of Tokyo and the University of New South Wales actually demonstrated what’s known as “quantum teleportation,” which involves teleporting information. Although it’s a long way from “Beam me up, Scotty,” it does provide a glimpse of what might be yet to come. At the exponential rate of technological advancement that futurists like Ray Kurzweil talk about, “beam me up” teleportation may actually become reality one day.
And certainly, teleportation would be a better realization of the “being there” model than the “TV model” we’ve learned to accept. Of course, it would be a disruptive technology the likes of which we’ve rarely seen, and it would redefine (or make obsolete) videoconferencing and the travel industry in transformative ways. But as Charles Darwin might have said, if you can’t adapt to change in life, then go home and watch it on television.