This has got me thinking about how technology seems to march on by leaps and bounds in some areas. Specifically, I'm referring to the silicon that runs our electronics. It has advanced at a predictable rate for over 30 years now: our ability to fit transistors onto a chip of a given size has doubled roughly every 2 years. By some accounts it doubles every 18 months.
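Just to put that in perspective, here's a quick back-of-the-envelope calculation (the 30-year span and the two doubling periods are the only inputs; everything else is illustrative):

```python
# Back-of-the-envelope: how much transistor density multiplies over
# 30 years if it doubles every 2 years vs. every 18 months.

def density_multiplier(years, doubling_period_years):
    """How many times density grows over the given span of steady doubling."""
    return 2 ** (years / doubling_period_years)

span = 30  # years of steady doubling

print(f"Doubling every 2 years:   {density_multiplier(span, 2.0):,.0f}x")
print(f"Doubling every 18 months: {density_multiplier(span, 1.5):,.0f}x")
# Roughly 33,000x vs. 1,000,000x over the same 30 years -- the assumed
# doubling period changes the answer enormously.
```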
What that means for the users of computers, which these days is just about everyone, is that every couple of years our computing devices get about twice as good. Twice as good at doing things you usually don't even notice. But over the last few years they have also gotten better and better at using less electricity to accomplish the same tasks. This is essentially what allows us to have our iPhones and Androids.
The other major component, the battery, is a different story. Over the last 30 years batteries really haven't improved all that much. When I was a little kid you could buy rechargeable batteries. Most of these were nickel-cadmium based. They worked pretty well, but they generally held less power than their alkaline counterparts. If you think back to that portable cassette player you had to have in 1984, you might recall that a set of Duracells would outlast a set of rechargeable batteries. Then sometime in the mid-to-late 90s nickel-metal hydride batteries became all the rage. They packed more punch per pound and were much lower maintenance. But if you can recall cell phones from that era, you will remember that they were usually huge.
Today the most common batteries are lithium based. There are a few different types in use, but they all perform about the same. We use them because chemists have known for a long time that lithium packs more power per pound than nickel-based chemistries. In truth, chemists have known for a long time which elements would make the best batteries. But the march toward better batteries has been slow.
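To make "more power per pound" concrete, here's a rough comparison using commonly cited ballpark energy densities (these are round illustrative numbers, not specs for any particular cell):

```python
# Rough gravimetric energy density ballparks (watt-hours per kilogram).
# Commonly cited ranges simplified to single round values, for
# illustration only.
energy_density_wh_per_kg = {
    "nickel-cadmium (NiCd)":       50,
    "nickel-metal hydride (NiMH)": 90,
    "lithium-ion (Li-ion)":        180,
}

baseline = energy_density_wh_per_kg["nickel-cadmium (NiCd)"]
for chemistry, density in energy_density_wh_per_kg.items():
    print(f"{chemistry:30s} ~{density:3d} Wh/kg  "
          f"({density / baseline:.1f}x NiCd)")
# Same energy in a lighter pack, or more energy at the same weight --
# which is part of why phones shrank as they moved from NiCd to Li-ion.
```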
So here is why I believe that is: science marches to the beat of the hypothesis while technology marches to the beat of necessity.
A typical science project starts out as an idea someone has. They might write something like: perhaps A + B will produce X. They then design experiments to test whether the proposed hypothesis is true or false. The "deliverables" section is reserved as an afterthought. What I mean is that in scientific proposals the deliverables section usually comes at the end, and it usually is not a product. More often than not, the only thing a hypothesis promises is a process by which to replicate the intended results.
What I am saying is that science often doesn't move toward practical ideas and useful materials. Often it marches in random directions just to "see if it's possible." For example, a geneticist of sorts hypothesizes that it would be possible to grow a third leg on a frog. They design an experiment and eventually grow a third leg on a frog. Nowhere along the way do they bother to ask whether it is even worth doing, whether it will produce a useful product, or whether it will add anything useful to our collective knowledge. Often the point of science is just to see if something can be done.
Technology, on the other hand, has out of necessity been mostly excluded from this process. Technology research is extremely expensive. It is highly profit driven and has been for some time. So technology often races after money. Money is made by selling stuff. You sell stuff if it does the required job.
About 15 years ago I noticed a funny trend in computing technology. I was a teenager and I loved to play video games, which were becoming highly sophisticated at the time. This aggravated my father. He did not like that I would often take over his office computers after hours to play StarCraft and Age of Empires with my cousins. If only he had known how many times I broke his computers trying to get games to work on his network.
Anyway, I noticed something funny about the computers of the time. A company that manufactured parts would make something really cool and super powerful. A week after it was released, some piece of software would come along that pushed that hardware to the very edge of what it could do. A few more weeks and it was almost useless for the newest games on the market. Often game makers would build in a way for end users to tone down a game so it would not eat the computer's resources. Some games took up to a year before parts makers could build hardware that really did them justice.
So technology marched on at a quick pace because software engineers, specifically game designers, wanted to do more and show their audience more. Hardware could barely keep up just trying to please the customers.
The same thing is driving the market now. It's less pronounced because, in some sense, the hardware has caught up. Console game machines can now reproduce beautiful scenes within games. PCs have moved along to the point where gaming laptops are highly affordable; I'm writing this on one I got for $750.
iPads and devices like them are going to be more and more important to everyday users. These devices all rely heavily on batteries. The whole idea behind them is mobility, so technology has moved toward that in high gear. New chips are being designed to be very powerful while consuming very little power.
The problem is that batteries are not marching on as quickly. New battery technologies are being developed, but sadly they are mostly backed by universities that follow the hypothesis system. For instance, a battery innovation invented at Stanford about 3 years ago promised roughly 8 times the charge of existing batteries. The scientist behind it said at the time that it would take 5 years to get the innovation into production. As of today, they are "expecting" a prototype of this battery for cell phones to be made available this year. Not a production unit, a prototype. Three years to deliver a prototype. Why?
Simply put, it is not in the best interest of scientists to deliver products. Many scientists work for institutions that retain part or all of the patents developed by their employees. So innovations don't get put out to market, because the scientist gets funded to do research, not paid royalties for each battery made using his process. Scientists would often rather come up with hypotheses that take years and years to prove or disprove, simply to have job security and long-term funding.
If Intel were researching this battery, they would have had some kind of production unit within 12 months of announcing the idea. If AMD were doing the same thing, they might get something out in 9 months simply to beat Intel to the punch. If ARM Holdings were behind it, they would have drawn up plans for how to do it and sold licenses to Texas Instruments, IBM, and Nvidia, and we would probably see 8 versions inside the best new cell phones by Christmas.
Would it make you sad to know that while Stanford is making a battery with 8 times the charge of current cell phone batteries, there are other companies out there making sodium batteries that are even better? Some of these batteries have been in R&D for a decade. There is one substance that science loves to talk about, carbon nanotubes, that could potentially improve current batteries by 100 to 1000 times. This stuff was discovered in the 90s. It's a favorite in the research world. What I don't get is why almost nobody is researching mass production of it. I mean, there are a few out there that do, but why aren't Energizer or Intel or IBM doing whatever they can to mass produce this stuff? Not only does it have the potential to increase battery power, but it also has the potential to decrease the power consumed by electronics by around 100 times.
Imagine that for a minute and you might understand my distress: a cell phone that lasts 100 times longer on a single charge because it uses far less electricity to perform the same functions, plus a battery that is 100 to 1000 times smaller.
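If you want the arithmetic behind that daydream, it's simple enough to sketch out (the 100x figures are the speculative ones from above, and the 24-hour baseline runtime is just an assumption for illustration):

```python
# Hypothetical arithmetic for the scenario above. The 100x factors are
# the speculative numbers from the text, not real product specs, and
# the 24-hour baseline runtime is an assumption for illustration.

baseline_runtime_hours = 24   # assumed runtime of a typical phone today
power_reduction = 100         # electronics draw 1/100th the power
density_improvement = 100     # battery stores 100x the charge per unit size

# Lower power draw alone stretches the same charge 100x further.
new_runtime_days = baseline_runtime_hours * power_reduction / 24
print(f"Runtime on the same size battery: ~{new_runtime_days:.0f} days")

# Higher energy density alone shrinks the battery 100x for the same charge.
print(f"Battery volume for today's charge: 1/{density_improvement} of today's")
```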