Focused on the human equation
As a PhD student, David Melville worked at The MacDiarmid Institute with Professor Richard Blaikie investigating photolithography. He has recently shifted into the field of what IBM calls Cognitive Computing.
AS A PhD STUDENT, David Melville worked at The MacDiarmid Institute with Professor Richard Blaikie investigating photolithography—the process by which computer chips are made. Their aim was to prove the controversial theory of the perfect lens, or superlens.

The superlens goes beyond the diffraction limits of conventional optical devices. Conventional lenses capture only propagating light waves—those waves that travel from a light source or an object to a lens or the human eye. Because that is all they capture, the image formed by the diffracted light is slightly blurred. The superlens would also capture what is known as evanescent light, or light that decays. “So this would mean you can capture information at a greater distance from the object,” says Melville. “What is really important about that light is that it holds very high resolution information about that object … this relates directly to our ability to transfer very high resolution images on very small computer chips.”

Their breakthrough work, which involved using near-field lithography to make nanoscale silver gratings, was published in 2005 and has been cited more than 200 times.

The research would never have been possible without the Institute’s nanofabrication technology, he says. “The equipment allowed us to fabricate or replicate that photolithography process; it was state-of-the-art technology. The MacDiarmid Institute gave us the framework to be able to do that sort of work … We were always hands-on with everything we did, with these big expensive machines,” he recalls. In the USA, where he is now based, PhD students would be unlikely to operate such technology themselves. “In New Zealand there is much more of a ‘get your hands dirty’ approach.”

After graduating with his PhD, Melville took up a postdoctoral role at IBM in the Big Apple, where he still works—the company has around a dozen research labs worldwide.
“Covering basically every area of science, but with a strong leaning towards physical science.”

Initially he continued his work in photolithography—on the continual scaling down of microprocessor manufacturing. More recently he has worked on smart grids in the energy industry. As he notes, this is an industry in which the companies involved are responsible for the equipment that modern life depends upon, such as transformers, power poles and transmission wires. The industry also depends on people responding to events that affect those assets.

Melville’s work has been in the area of analytics: using computers to construct a model of an organisation’s infrastructure in order to predict how it could be affected by unexpected forces, such as a hurricane or sudden changes in power use. The work included a focus on visualisation—working out how that information could best be conveyed to the people operating those assets, in a way they could easily understand. “When you are talking about information regarding a large number of assets, you need to allow people to make decisions about dealing with those assets very quickly.”

He has recently shifted into the field of what IBM calls Cognitive Computing. What distinguishes cognitive computing systems from Artificial Intelligence (AI) is that humans are part of the equation. “This is what we term the partnership focus—finding the best way to make humans and computers work together, to get the best out of both and the best end result.” He is looking at the human side of things—how the human is best represented in this system.

Predictions for the not-too-far-away future? Think of the motion-sensing technology in the movie Minority Report. “So the next step will be separating you from the device, and all you’re doing is talking and moving your hands around and interacting as you would if talking with people.”