Thursday, February 18, 2010

Dubai Assassination Caught on Surveillance

Big news this week was the assassination of a Hamas official in Dubai by a team of more than 10 assassins. What was particularly striking to me was how much of the unfolding event was captured on surveillance video. I've often asked my students how much of their lives they think is captured on video -- certainly not as much as in a place like Dubai. But whatever amount it is, it will only be increasing.

This raises at least two significant issues. The first is the social issue of privacy. It's not exactly that surveillance makes private things public; rather, it turns public events that would have been transient into permanent records. What will the social cost be?

Second is the technological issue of how to deal with the deluge of data. Not just how to store it, but what to do with it. Imagine a day when intelligent software can monitor the massive flow of data from ubiquitous video, audio, and other sensors. Will we be able to prevent crimes and terrorist acts by catching the would-be perpetrators in advance? Starting to sound a bit like "Minority Report"!

Image credit: http://blogs.edweek.org/edweek/LeaderTalk/

Real-time language translation coming soon?

Google developing a translator for smartphones

It's hard enough for machines to translate text; harder to translate spoken language; and harder still to do it in real time. But tech behemoth Google is working on technology to do exactly that.
"Head of translation services at Google, Franz Och, said he believed almost instant speech-to-speech translation should be possible if the accuracy of voice recognition and machine translation can be improved. He said Google is working on this, and he expected the technology to “work reasonably well” in a few years. There has been a great deal of progress in voice recognition and machine translation in recent years, especially the latter, thanks to funding from the U.S. Defense Advanced Research Projects Agency (DARPA)."

Robots will replace all workers in 25 years

Cisco Systems futurist Dave Evans has no shortage of shocking projections about the future of technology and the human race. Among them:
  • artificial brain implants
  • data created at a rate of 92 million Libraries of Congress per year
  • cost of storage dropping below $10 per petabyte (at that price, you could create a high-def video of every second of your entire life for less than $100, assuming you don't live past 130! See the back-of-envelope sketch after this list.)
  • instantaneous language translation
  • intelligent machines with feelings

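That lifetime-video claim checks out with simple arithmetic. Here's a back-of-envelope sketch; the 5 Mbit/s bitrate for a high-def stream is my assumption, not a figure from Evans:

```python
# Back-of-envelope check of the lifetime-video claim above.
# Assumptions (mine, not Evans's): 5 Mbit/s high-def stream, 130-year life.

SECONDS_PER_YEAR = 365.25 * 24 * 3600
bitrate_bits_per_sec = 5e6            # 5 Mbit/s video stream
lifespan_years = 130

total_bytes = bitrate_bits_per_sec / 8 * lifespan_years * SECONDS_PER_YEAR
total_petabytes = total_bytes / 1e15

cost = total_petabytes * 10           # $10 per petabyte
print(f"{total_petabytes:.1f} PB, about ${cost:.0f}")   # ~2.6 PB, about $26
```
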
See more at itWorld Canada

Saturday, February 13, 2010

Programming methods for multicore

There have been many advances in programming paradigms (functional, object-oriented) and programming methodologies (agile, extreme), but much development is still done with a single processor -- or at most a few -- in mind. That has to change if software is to take advantage of the power of multicore machines, and researchers at Berkeley are developing new parallel programming methodologies to make it happen.
Using [their] approach, one team created a program that reduced the time needed to create an MRI image from one hour to one minute. The code is already being used at a local children's hospital.
In another example, the approach reduced the time to handle object recognition from 222 seconds on an Intel Nehalem processor to 1.8 seconds on a massively parallel Nvidia GTX 280 chip. Other efforts in areas including speech recognition, option trading, and machine learning showed performance gains ranging from 11-fold to 100-fold.
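
The Berkeley work is about much more sophisticated parallel patterns, but the basic payoff of spreading independent work across cores is easy to demonstrate with nothing more than Python's standard library. This toy sketch is my illustration of data parallelism, not the researchers' framework:

```python
# Toy illustration of data parallelism: spread independent work across
# cores with Python's standard library. This is *not* the Berkeley
# framework, just the general idea of exploiting multicore hardware.
from multiprocessing import Pool
import time

def heavy_task(n: int) -> int:
    # Stand-in for an expensive, independent unit of work.
    return sum(i * i for i in range(n))

if __name__ == "__main__":
    jobs = [2_000_000] * 8

    start = time.time()
    serial = [heavy_task(n) for n in jobs]        # one core
    t_serial = time.time() - start

    start = time.time()
    with Pool() as pool:                          # all available cores
        parallel = pool.map(heavy_task, jobs)
    t_parallel = time.time() - start

    assert serial == parallel
    print(f"serial {t_serial:.2f}s, parallel {t_parallel:.2f}s")
```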

See more on Berkeley's progress in parallel programming.

Will Keyboards give way to Brain Control Interfaces?

Mind Over Matter... ExtremeTech

I remember sitting in typing class back in high school, wondering "when in the world am I ever going to use this skill?" I couldn't have envisioned how computers would change things.

Despite the rapid advancement of computers, a few elements have remained nearly universal: the display screen, the keyboard, and the pointing device. These components have been such an integral part of computing since the early days that it's hard to imagine a computer without them. Prediction: this is certainly going to change. Some new input technology will come along, and keyboards will go the way of rotary-dial phones. We don't know yet what that new technology will be, but it's coming.

An intriguing possibility: direct brain interfaces. If we could master two-way neural/electronic interfaces, we could get rid of all three: keyboard, mouse, AND monitor.
"The ability to influence the physical world merely by thought has been a dream of mankind for many years. Now researchers are making real progress in letting people control a PC simply by thinking, and the first crop of consumer Brain Control Interface (BCI) headsets has arrived. Right now these are being used only for simple games, and hardware and applications to support the technology are scarce. But this still represents a major advance that could significantly change how we all interact with computers."

As the story goes on to describe, a natural application of brain control interfaces (BCIs) is to replace lost functionality in people who have suffered loss of motor control. Once BCIs can be used to replace lost functionality, it's almost inevitable that they will be used to introduce new functionality -- built-in GPS, anyone? Human calculators? Eidetic memory? Why not control the world around you with a mere thought? Turn lights on and off. Set the thermostat. Change the channel. Sound far-fetched? Maybe. But how many of today's conveniences would have sounded the same to earlier generations?

Friday, February 12, 2010

Donate your computing power to map the Milky Way

PCs Around the World Unite To Map the Milky Way
"At this very moment, tens of thousands of home computers around the world are quietly working together to solve the largest and most basic mysteries of our galaxy.

"Enthusiastic and inquisitive volunteers from Africa to Australia are donating the computing power of everything from decade-old desktops to sleek new netbooks to help computer scientists and astronomers at Rensselaer Polytechnic Institute map the shape of our Milky Way galaxy. Now, just this month, the collected computing power of these humble home computers has surpassed one petaflop, a computing speed that surpasses the world’s second fastest supercomputer."
This "crowd-sourcing" approach was used early on by SETI@home, the Search for Extra-Terrestrial Intelligence project, and more recently was responsible for finding the largest known Mersenne prime.
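
The architecture behind all of these projects is the same: a central server splits a huge problem into small, independent work units, farms them out to volunteer machines, and aggregates the results. A schematic sketch with invented names (real projects typically build on the BOINC platform):

```python
# Schematic of volunteer ("crowd-sourced") computing: a server farms out
# small, independent work units and combines the returned results.
# Names here are invented for illustration; real projects use BOINC.
from queue import Queue

def make_work_units(problem, size):
    """Split the big problem into independent chunks."""
    return [problem[i:i + size] for i in range(0, len(problem), size)]

def volunteer_compute(unit):
    """What each home PC runs: score one chunk of the search space."""
    return max(unit)  # stand-in for the real per-unit computation

work = Queue()
for unit in make_work_units(list(range(1_000_000)), 10_000):
    work.put(unit)

results = []
while not work.empty():     # in reality, thousands of PCs pull in parallel
    results.append(volunteer_compute(work.get()))

print("combined answer:", max(results))
```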

Image credit: NASA/JPL/Caltech/R. Hurt (SSC)

First chess, now Jeopardy?

IBM is working on a Jeopardy-playing computer that can beat human contestants. Though it can't yet beat the best human players, it's pretty amazing that it can do everything it needs to do. Consider:

Without being connected to the Internet, the computer has to understand natural language, determine the answer to a question (or, in the case of Jeopardy, the question to an answer), and then calculate the odds that its answer is correct in order to decide whether it is worth buzzing in.

The fear of getting a question wrong and losing money prevents many a wrong answer from a human Jeopardy contestant. At the same time, humans often instinctively know they know the answer to a question, even if it doesn't pop into their heads right away. So a human Jeopardy player will often click the buzzer upon hearing the question, and then spend the next several seconds pulling the answer out of the memory bank. To compete on Jeopardy, a computer must determine whether it knows the correct response within seconds.
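
That buzz-or-pass choice is, at bottom, an expected-value calculation: a wrong answer costs the clue's dollar value, so the machine should buzz only when its confidence makes the gamble worthwhile. A simple sketch of that logic -- my illustration, not IBM's actual strategy:

```python
# Illustration of the buzz decision as expected value: a wrong answer
# *costs* the clue's dollar value, so confidence must clear a threshold.
# This is my simplification, not IBM's actual strategy.

def expected_value(confidence: float, clue_value: int) -> float:
    # Win clue_value with probability `confidence`, lose it otherwise.
    return confidence * clue_value - (1 - confidence) * clue_value

def should_buzz(confidence: float, clue_value: int) -> bool:
    return expected_value(confidence, clue_value) > 0  # confidence > 0.5

print(should_buzz(0.83, 800))   # True:  EV = 0.83*800 - 0.17*800 = +528
print(should_buzz(0.40, 800))   # False: EV = 0.40*800 - 0.60*800 = -160
```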

See the original story: IBM's Jeopardy-playing machine can now beat human contestants.

Thursday, February 11, 2010

The Emotional Computer

Can your computer make you happy?

Researchers in the Affective Computing Group at the MIT Media Lab are trying to improve your computer by giving it emotions -- or rather, the ability to detect them. Plenty of people get mad at their computers. What if the computer could sense that, and respond by changing the way it worked?
"We're not talking about handing a global missile defence system over to a PC with a lot of feelings, but we are talking about improving the interface between humans and computers by making computers easier and more intuitive to interact with. For example, computers that sense when a user is getting frustrated could try a different way of explaining how to troubleshoot the internet connection. Cars that can tell when a driver is about to fall asleep could sound an alarm."

"The laboratory for Affective Computing at The Massachusetts Institute of Technology puts it like this: "Emotion is fundamental to human experience, influencing cognition, perception, and everyday tasks such as learning, communication, and even rational decision-making. However, technologists have largely ignored emotion and created an often-frustrating experience for people, in part because affect has been misunderstood and is hard to measure."

At some point in the future, will your computer get mad at you for the way you're acting? We'll have to wait and see.

Saturday, February 6, 2010

Can we trust computers to drive our cars?

The Electronic Systems That Make Modern Cars Go (and Stop)

We don't have to look to the future to see the rise of autonomous vehicles; computers have already taken over many of the low-level functions of cars.
"The electronic systems in modern cars and trucks — under new scrutiny as regulators continue to raise concerns about Toyota vehicles — are packed with up to 100 million lines of computer code, more than in some jet fighters."

"Even basic vehicles have at least 30 of these microprocessor-controlled devices, known as electronic control units, and some luxury cars have as many as 100.

"These electronic brains control dozens of functions, including brake and cruise control and entertainment systems. Software in each unit is also made to work with others. So, for example, when a driver pushes a button on a key fob to unlock the doors, a module in the trunk might rouse separate computers to unlock all four doors.

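The key-fob example is a nice illustration of how these modules cooperate: one controller broadcasts a command on a shared bus, and every module that cares reacts. A schematic sketch with invented message names (real cars use protocols such as CAN):

```python
# Schematic of the key-fob example: one controller broadcasts an "unlock"
# command on a shared bus, and each door module reacts. Message names and
# structure are invented for illustration; real cars use buses like CAN.

class Bus:
    def __init__(self):
        self.listeners = []

    def attach(self, ecu):
        self.listeners.append(ecu)

    def broadcast(self, message: str):
        for ecu in self.listeners:
            ecu.receive(message)

class DoorModule:
    def __init__(self, position: str):
        self.position = position

    def receive(self, message: str):
        if message == "UNLOCK_ALL":
            print(f"{self.position} door: unlocked")

bus = Bus()
for position in ("front-left", "front-right", "rear-left", "rear-right"):
    bus.attach(DoorModule(position))

# The trunk module hears the key fob and wakes the door computers.
bus.broadcast("UNLOCK_ALL")
```
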
Will faulty brake controllers cause us to reject computer-controlled components? Not a chance! For better or worse, our society regularly determines acceptable levels of risk. As long as computer-controlled devices provide sufficient value to outweigh the risks, we will accept them. In most cases, computer-controlled devices actually reduce risks when compared to fallible human controllers.

For more on this topic, see this NPR story entitled Despite Glitches, Electronics Make Cars Safer.

Wednesday, February 3, 2010

IBM Preps Carbon Transistors

Carbon Transistors for Post-Silicon Era

Moore's law may be running out of steam for silicon-based semiconductors, but IBM is working on the next big thing. Other alternatives might include DNA-based or liquid chemical computers.
"IBM Research demonstrated a carbon-based transistor technology that could make obsolete silicon-based CMOS chips over the next decade."
Graphene solves the No. 1 problem with today's silicon chips: Namely, that as we make them smaller and faster, they generate more and more heat. Carbon, on the other hand, harnesses quantum effects to reverse that trend, consuming less and less power as chips are made increasingly smaller and faster.
Graphene sheets also have higher carrier mobilities (the speed at which electrons are propelled by a given voltage). Carrier mobilities hundreds of times larger than those of silicon should translate into comparably faster chip speeds for graphene.
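
For the curious, carrier mobility (usually written μ) is just the proportionality between the applied electric field and the drift velocity electrons reach. Using rough literature values -- my numbers, not the article's -- the "hundreds of times" figure is at least the right order of magnitude:

```latex
% Drift velocity is mobility times field: v_d = \mu E.
% Rough literature values (illustrative, not from the article):
% silicon electrons ~1,400 cm^2/(V s); graphene up to ~200,000 cm^2/(V s).
v_d = \mu E, \qquad
\frac{\mu_{\mathrm{graphene}}}{\mu_{\mathrm{Si}}}
  \approx \frac{2 \times 10^{5}\ \mathrm{cm^2/(V \cdot s)}}
               {1.4 \times 10^{3}\ \mathrm{cm^2/(V \cdot s)}}
  \approx 140
```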

Mind-reading in consumer devices

Mind-control mobile-phone games -- anyone ready for a direct brain interface?

"Mobile-games researcher Dr Paul Coulton and PhD student Will Bamford of InfoLab21 recently unveiled their new game ‘Brain Maze’, in which players use ‘tilt’ controls and a brain-wave reading headset to progress a marble around a course.

"At key checkpoints around the maze, the accelerometer-equipped phone picks up electromagnetic waves from the player’s brain - ‘Brain Maze’ uses alpha waves, which are associated with a meditative state, and beta waves, which are associated with an attentive state, to control access through the ‘mind gates’ that form part of the game."

The game is controlled by NeuroSky’s MindSet brainwave-interface headset, which retails for just $199.
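
The control scheme is easy to picture: the headset reports rough strengths for the alpha and beta bands, and each "mind gate" compares the relevant band to a threshold. This toy sketch is my reconstruction of the mechanic, not Brain Maze code or the NeuroSky API:

```python
# Toy reconstruction of the 'mind gate' mechanic: a gate opens only when
# the required brain-wave band is strong enough. Signal values and the
# threshold are invented; this is not the NeuroSky API or Brain Maze code.

def gate_open(gate_type: str, alpha: float, beta: float,
              threshold: float = 0.6) -> bool:
    """alpha/beta are normalized band strengths in [0, 1] from the headset."""
    if gate_type == "meditation":   # opens on strong alpha (calm) signal
        return alpha >= threshold
    if gate_type == "attention":    # opens on strong beta (focus) signal
        return beta >= threshold
    return False

print(gate_open("meditation", alpha=0.72, beta=0.30))  # True: calm enough
print(gate_open("attention",  alpha=0.72, beta=0.30))  # False: not focused
```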

'Smart dust' to explore planets

'Smart dust' to explore planets

Space exploration is a perfect place for using smart dust and other nanotech. Rather than sending huge, expensive, complex devices to other planets (think Mars Rovers, which nevertheless are pretty cool), why not send a swarm of nanodevices that can be dispersed over a large portion of the planet? This approach could be much more robust -- half the sensors could fail, and we'd still receive plenty of interesting info. BBC News reports:
"Tiny 'smart' devices that can be borne on the wind like dust particles could be carried in space probes to explore other planets, UK engineers say. The devices would consist of a computer chip covered by a plastic sheath that can change shape when a voltage is applied, enabling it to be steered.
Even better, use nanotech von Neumann probes (devices that are capable of self-replication). In theory, we could send a couple hundred such probes to a place like Europa, and they would use the raw materials found there to replicate themselves, resulting in massive coverage from a small initial payload.
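
The appeal of self-replication is exponential growth: if every probe builds one copy of itself per cycle, coverage doubles each cycle. A quick illustration (the seed count of 200 and the number of cycles are arbitrary assumptions):

```python
# Exponential coverage from von Neumann probes: each cycle, every probe
# builds one copy of itself, doubling the population. Starting count and
# cycle count are arbitrary assumptions for illustration.

probes = 200                      # initial payload
for cycle in range(1, 11):
    probes *= 2                   # every probe replicates once per cycle
    print(f"after cycle {cycle:2d}: {probes:,} probes")

# After 10 cycles: 200 * 2**10 = 204,800 probes from a 200-probe payload.
```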

Human Augmentation and Future Soldiers

Future Soldiers: Brain Boosters, Exoskeletons and Digital Buddies

Today's technologies to replace lost functions (cochlear implants, vision implants, advanced prosthetics) will inevitably lead to technologies that add new functions that we've never had (Human GPS? Night vision? Super strength? Thought-controlled robots?).
"The soldiers of the future might controversially boost their brains with drugs and prosthetics, augment their strength with mechanical exoskeletons, and have artificially intelligent 'digital buddies' at their beck and call, according to the U.S. Army's Future Soldier Initiative.

"The project is the latest attempt from the U.S. Army research lab in Natick, Mass., to brainstorm what soldiers might carry into the battlefield of tomorrow. A special emphasis of its concept is augmenting mental performance."

Monday, February 1, 2010

Smart Dust Coming?

Advances Bring ‘Smart Dust’ Closer

The idea of "Smart Dust" first caught my attention in the work of Vernor Vinge (one of my favorite authors) in his 1999 novel A Deepness in the Sky.

I've posted recently about the incredible miniaturization of RFID technology, but now Hewlett-Packard is pursuing a project "to embed up to a trillion pushpin-size sensors around the globe." Pushpin-size is pretty big compared to smart dust, but it's a step...
Last year, Hewlett-Packard began a project it grandly calls “Central Nervous System for the Earth,” a 10-year initiative to embed up to a trillion pushpin-size sensors around the globe... Microchip-equipped sensors can be designed to monitor and measure not only motion, but also temperature, chemical contamination or biological changes. The applications for sensor-based computing, experts say, include buildings that manage their own energy use, bridges that sense motion and metal fatigue to tell engineers they need repairs, cars that track traffic patterns and report potholes, and fruit and vegetable shipments that tell grocers when they ripen and begin to spoil.
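
It's worth pausing on the scale of such a network. Even at a modest sampling rate, a trillion sensors produce a torrent of data; here's a back-of-envelope calculation in which every number is my assumption rather than an HP figure:

```python
# Back-of-envelope data volume for a trillion-sensor network.
# Sampling rate and reading size are my assumptions, not HP's figures.

sensors = 1e12                    # one trillion sensors
readings_per_sec = 1              # one sample per sensor per second
bytes_per_reading = 8             # a timestamped measurement, say

bytes_per_sec = sensors * readings_per_sec * bytes_per_reading
print(f"{bytes_per_sec / 1e12:.0f} TB/s, "
      f"{bytes_per_sec * 86400 / 1e18:.2f} EB/day")   # 8 TB/s, ~0.69 EB/day
```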