April 29, 2011

Science vs Technology

   The so-called smartphone is a handheld conundrum for me. On the one hand it is a concise summary of modern consumer technology and our ability to pack nearly all our daily computing needs into one compact device. On the other hand the technology is limited by a small chemical reaction package that seems to improve at a snail's pace. I'm talking about the battery, of course.
   This has got me thinking about how technology seems to march on in some areas by leaps and bounds. Specifically I'm referring to the silicon that runs our electronics. It has advanced at a predictable rate for over 30 years now: our ability to fit transistors onto a chip of a given size has doubled every 2 years or so (the trend known as Moore's Law). By some accounts it doubles every 18 months.
   What that means for the users of computers, which these days is just about everyone, is that every couple of years our computing devices get about twice as good, usually at doing things you don't really notice. But over the last couple of years chips have also become better and better at accomplishing the same tasks with less electricity. This is essentially what allows us to have our iPhones and Androids.
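   To put rough numbers on that doubling, here is a quick back-of-the-envelope sketch in Python (the 2-year doubling period and 30-year span are simply the figures from above, taken at face value):

      # Transistor count relative to a 30-year-old baseline, assuming a
      # clean doubling every 2 years (Moore's Law in its simplest form).
      years = 30
      doubling_period = 2
      growth = 2 ** (years / doubling_period)
      print(f"After {years} years: {growth:,.0f}x the transistors")
      # After 30 years: 32,768x the transistors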
   The other major component, the battery, is a different story. Over the last 30 years batteries really haven't improved all that much. When I was a little kid you could buy rechargeable batteries. Most of these were nickel-cadmium based. They worked pretty well but they generally held less power than their alkaline counterparts. If you think back to that portable cassette tape player you had to have in 1984 you might recall that a set of Duracells would last longer than a set of rechargeable batteries. Then sometime in the mid to late 90s nickel-metal hydride batteries became all the rage. They packed more punch per pound and were much lower maintenance. But if you can recall cell phones from that era you will remember that they were usually huge.
   Today the most common batteries are lithium-based. There are a few different types in use but they all perform about the same. We use these because chemists have known for a long time that lithium packs more power per pound than nickel-based batteries. In truth, chemists have known for a long time which elements would make the best batteries. But the march toward better batteries has been slow.
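   To illustrate the gap, here is a quick comparison in Python (the energy-density figures are ballpark numbers, not measurements, and real cells vary widely):

      # Rough energy density in watt-hours per kilogram for common
      # rechargeable chemistries, circa 2011 (approximate figures).
      densities = {"NiCd": 50, "NiMH": 80, "Li-ion": 150}
      for chem, wh_per_kg in densities.items():
          ratio = wh_per_kg / densities["NiCd"]
          print(f"{chem}: ~{wh_per_kg} Wh/kg ({ratio:.1f}x NiCd)")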
   So here is why I believe that is: science marches to the beat of the hypothesis while technology marches to the beat of necessity.
   A typical science project starts out as an idea someone has. They might write something like: perhaps A+B will produce X. They then design experiments to test whether the proposed hypothesis is true or false. The "deliverables" section is reserved for an afterthought. What I mean is that in scientific proposals the deliverables section usually comes at the end and usually is not a product. More often than not the only promise of a hypothesis is a process by which to replicate the intended results.
   What I am saying is that science often doesn't move toward practical ideas and useful materials. Often it marches in random directions just to "see if it's possible." For example, a geneticist hypothesizes that it would be possible to grow a third leg on a frog. They design an experiment and eventually grow a third leg on a frog. Nowhere along the way do they bother to ask if it is even worth doing, or whether it will produce a useful product or add useful information to our collective knowledge. Often the point of science is just to see if something can be done.
   Technology, on the other hand, has out of necessity been mostly excluded from this process. Technology research is extremely expensive. It is currently highly profit driven and has been for some time. So technology often races after money. Money is made by selling stuff. You sell stuff if it will do the required job.
   About 15 years ago I noticed a funny trend in computing technology. I was a teenager and I loved to play video games. They were at the time becoming highly sophisticated. This aggravated my father. He did not like that I would often take over his office computers after hours to play StarCraft and Age of Empires with my cousins. If only he knew how many times I broke his computers trying to get games to work on his network.
   Anyway, I noticed something funny about computers at the time. Companies that manufactured parts would make something really cool and super powerful. A week after it was released some software would come along that would push that hardware to the very edge of what it could do. A few more weeks and it was almost useless for the newest games on the market. Often game makers would build in a way for end users to tone down a game so it would not eat the computer's resources. Some games took up to a year before parts makers could build hardware that really did them justice.
   So technology marched on at a quick pace because software engineers, specifically game designers, wanted to do more and show their audience more. Hardware could barely keep up just trying to please the customers.
   The same thing is driving the market now. It's less pronounced because in some sense the hardware has caught up. Console game machines can now reproduce beautiful scenes within games. PCs have moved along to the point where gaming laptops are highly affordable; I'm writing this on one I got for $750.
   Mobile devices like smartphones and the iPad are going to be more and more important to everyday users. These all rely heavily on batteries. The whole idea behind these things is mobility. So technology has moved toward that in high gear. New chips are being designed to be very powerful and consume little power.
   The issue is that batteries are not marching on as quickly. New technologies are being developed for batteries. Sadly they are mostly backed by universities that follow the hypothesis system. For instance, a battery innovation invented at Stanford about 3 years ago would increase the charge a battery can hold by about 8 times. The scientist who made this innovation said at the time it would take 5 years to get it into production. As of today they are "expecting" a prototype of this battery for cell phones to be made available this year. Not a production unit, a prototype. 3 years to deliver a prototype. Why?
   Simply put, it is not in the best interest of scientists to deliver products. Many times scientists work for institutions that retain part or all of the patents developed by their employees. So innovations don't get put out to market, because the scientist gets funded to do research, not paid royalties for each battery made using his process. Scientists would often rather come up with hypotheses that take years and years to prove or disprove, simply to have job security and long-term funding.
   If Intel were researching this battery they would have had some kind of production unit within 12 months of announcing the idea. If AMD were doing the same thing they might get something out in 9 months simply to beat Intel to the punch. If ARM Holdings were behind it they would have drawn up plans for how to do it and sold licenses to Texas Instruments, IBM and Nvidia, and we would probably be seeing 8 versions out by Christmastime inside the best new cell phones.
   Would it make you sad to know that while Stanford is making a battery with 8 times the charge of current cell phone batteries, there are other companies out there making sodium batteries that are even better? Some of these batteries have been in R&D for a decade. There is one substance that science loves to talk about, carbon nanotubes, that could potentially improve current batteries by 100 to 1000 times. This stuff was discovered in the early 90s. It's a favorite in the research world. What I don't get is why nobody is researching mass production of it. I mean, there are a few out there that do, but why isn't Energizer or Intel or IBM doing whatever it can to mass produce this stuff? Not only does it have the potential to increase battery power but it also has the potential to decrease the power consumed by electronics by around 100 times.
   Imagine that for a minute and you might understand my distress. A cell phone that lasts 100 times longer on a single charge because it uses far less electricity to perform the same function, plus a battery that is 100 to 1000 times smaller.

April 15, 2011

/epiphany

   The other day I was joking around with one of my bosses at work about computers. I should explain that I was originally hired to do lab work exclusively, but as our study has evolved my job has as well. I now do front-line IT work for my office. Basically I fix all the simple problems and handle all the hardware issues so our main IT person can focus on software development for the time being. He still deals with all of the high-level technical issues.
   This has presented me with my second opportunity to work hands on in an environment mixed with Apple and Windows based machines. The first was in college, working for the student newspaper.
   Because I am predominantly a PC or Windows user I occasionally make a jab at Apple, because my boss is an old-school Mac user. On this particular occasion I was making a jab at the fact that another boss's computer didn't come equipped to run Flash. Flash, as you may know, is something Steve Jobs has gone well out of his way to discourage.
   The stupid part about the joke was that Flash is a third-party internet browser plug-in. For those who don't follow, that simply means that essentially no computer system comes with Flash built in. The only exception to that rule is Chrome OS, which as of now is still in development and not available for general consumption.
   Realizing my stupidity and feeling quite ashamed, I hid away in my cave (I am one of the lucky few who does not work from a cubicle) and began to research operating systems. This is something that has been of great interest to me lately since my recent dabble in Linux.
   To further explain my curiosity: for a while now I had been wondering why Linux computers can see and alter files from another system. It seemed that if they were completely different systems they would not even be able to see the files from another computer system.
   A decade or two ago you could not place a disk formatted for Windows in a Mac and expect the Mac to even see the files. Heck, you were lucky if the system could recognize the media and ask for a format. Somewhere along the line that all changed. Now we don't even think when we put a flash drive in any system and the thing just works all the way around. I had wanted to know why for a while.
   Fueled by my curiosity and embarrassment I went on the hunt. I found what I was looking for on Wikipedia in the form of historical information.
   So here is my conclusion and a summary of what I learned. Computers are pretty much all the same. There are some minor differences between computers but most of those are superficial. All computers we use today from our desktop to the phone in our pockets are based on the same basic structures. All of these computers are descended from the same ancestry of information technology. They all speak the same language at the core. They all contain the same building blocks. The differences come in how these common blocks are utilized.
   At the lowest level computers speak zeros and ones. For a long time people had to speak that code to program and use a computer. Eventually a shorthand was created to make this simpler. Then another shorthand was created, and then another. These shorthands were used on computers that would be unrecognizable as such to most of us.
   Over time a new kind of shorthand was created to translate any of the others into something that resembled plain language, and the first real programming languages were born. Many of these languages were designed to be used with a specific machine. Over time certain of these machines became more dominant, and with them their languages.
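   For a concrete picture of those layers, here is the same tiny operation written three ways (the binary and assembly lines are the classic x86 example of loading the value 97 into a register; the Python strings merely stand in for each layer):

      # The same instruction at three levels of abstraction.
      machine_code = 0b10110000_01100001  # raw ones and zeros: the x86 bytes B0 61
      assembly = "MOV AL, 61h"            # the shorthand a person could read
      high_level = "x = 97"               # a plain-language programming statement
      print(bin(machine_code), assembly, high_level)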
   As computers became less expensive to manufacture and more powerful, there came a time when companies decided that a command based interface would be even better, as it would simplify the usage of the computer even more. It was at this time that operating systems were born. An operating system is nothing more than a plain language interface for a computer. It means that you can get the computer to function on some level without even knowing how to program. The OS (as I will refer to it from here on out) is basically a layer of programming that separates the user from the machine languages that only experts learned, so that just about anyone could use a computer.
   What really surprised me was that in the very earliest of days the OS of most machines was designed for just that machine, just as many early machine languages were designed for a specific machine. As I began to wonder about this it occurred to me that the reason must be that each machine had specific hardware that another machine would not have had or needed.
   A few years later some clever people decided that these machine specific parts could be encoded in software called drivers. Then any OS could run on a particular machine as long as the drivers were available to the OS. Each driver would tell the OS how to communicate with the machine's unique hardware setup.
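   Here is a minimal sketch of that arrangement in Python (the class and method names are invented for illustration; real driver interfaces are far more involved):

      # The OS defines one generic interface, and each driver translates
      # it for its own hardware, so the OS needs no machine-specific code.
      class DiskDriver:
          def read_block(self, n):
              raise NotImplementedError

      class VendorADisk(DiskDriver):
          def read_block(self, n):
              return f"vendor-A-specific read of block {n}"

      class VendorBDisk(DiskDriver):
          def read_block(self, n):
              return f"vendor-B-specific read of block {n}"

      def os_read(driver, n):
          # The OS only ever speaks the generic interface.
          return driver.read_block(n)

      print(os_read(VendorADisk(), 0))
      print(os_read(VendorBDisk(), 0))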
   It was at this time that the computer industry really began to have a place in our homes. Some of us had an Apple, others an IBM.
   Fast forward a few decades and we are back in the present. We have a handful of OS choices and they all can see the files made for and/or by another. Why is that? Well, it's mostly because the origin is all the same. Apple's OS 1 through 9 used a driver system specific to the machines Apple made. Apple held onto control of its hardware and software to create a very simplified user experience that was meant to be hassle free. They were more or less successful. Microsoft used its various systems to take a broad view that would accept as much hardware as possible in just about any configuration. At the core the two were doing the same tasks.
   Meanwhile there were a few other systems, equally important, that most consumers were not aware of. Unix and Linux were being developed for larger computer servers all these years. Unix is an OS designed to be extremely powerful, extremely stable and, probably most importantly, able to manage a very large amount of resources that typical home computers would never have. Linux was a project designed to emulate Unix but be free and, more importantly, something that anyone could adapt and change.
   One other system with similar properties was in use at the time: Windows NT. NT was designed originally for business users, to give them a stable system capable of managing a lot of resources. From the release of Windows XP onward, NT is the only system Microsoft releases.
   Some time in the last decade a few major shifts occurred, and most of us were not aware of them happening. Apple dumped its original OS for a system based on Unix. They created a hybrid of a few Unix systems and released it as OS X. At the time I thought the X stood only for the number 10. Now I know it, like many Unix-derived systems, also uses an X as a nod to its Unix origin.
   So, here we are in 2011 with a handful of OS choices. Windows 7 is a version of NT, which in turn is a system intended for networks. OS X is at its core a server software platform. And Android, which is increasingly popular for phones, is based on the core components of Linux, which is also designed for servers. All of our choices are essentially the same at the core in what they do and how they use the actual hardware.
   So, where does that leave those of us who are consumers? If all our choices are the same then what choice do we have? The more I think about these kinds of questions the more I realize that today's computer user is in a position that is new and wonderful in the history of computers. It means that it doesn't really matter what we buy. Brand names are irrelevant now. All of these computers are now capable of interacting with each other. They are capable of sharing information and more importantly our collective ideas and understanding.
   We live in a computer market where manufacturers have to compete for our dollars by providing us with value in areas that are actually important. We can buy the machine that best suits our own personal needs and wants and not be so concerned about whether or not it will be compatible with our office printer or network. We can be less concerned about how something works and more concerned with our own work and our own play.
   While it is still true that not all software will run on every device, this old concept is slowly dying. Software developers are realizing more and more that money is made by making software work everywhere and on everything. Other companies are working on ways to make software run on the server instead of the local machine. This gives anyone with an internet connection the ability to use their software from any computer or device. Google Docs is a great example of a really solid Office-like software platform that works from any connected computer, and in some places even without a connection, regardless of what type of computer it is.
   Next time you ask me what kind of computer I would recommend you buy, ask yourself a few questions first: What software do I want to run on my new computer? Will it run on another type? Would that other type/brand/model fit my needs equally well? How much money do I want to spend on a computer? Am I mainly going to be doing stuff on the internet? Am I going to play games on it? These kinds of questions are fundamental to your next computer purchase.
   Some simple guidance. An Apple is generally going to cost a bit more than a Dell or an Acer. Some Acer computers are better for gaming than a Mac by virtue of the better video card built into many of them. An iPad has a lot of apps that, for now, you can only get on an iPad. In a year all of the most popular third-party apps will be available on Android and a few months later on Windows (Angry Birds, for example). One thing you can only get from Apple is the MacBook Air. It feels like the fastest computer around and comes in a superlight, highly efficient package. But in a year every other manufacturer will copy it.
   So, the final point here is that when it comes to computers, look to the features that you want and that you need, not the ones Justin Long or Jerry Seinfeld mention in a TV ad. What do either of them know about you and your needs anyway?

April 5, 2011

Linux... Finally

   As an avid computer user and self-described tech geek/gadget fan, I don't know why I didn't get to this earlier. But I have finally installed and run Linux on my computer. As a beginner I decided to go with Linux Mint.
   Because I have been a long-time Windows user and didn't really want to reinstall everything on my laptop should things go wrong, I decided to go with a USB drive version (Trend Micro thinks this site is dangerous, though it was not so the first time I visited). Basically what this means is that I have installed Linux Mint on my 8GB flash drive, and when I want to run it I can use the boot menu to load it instead of Windows.
   The initial installation took probably 15 minutes. That included downloading and installing on a new flash drive that was already empty. Then it took me a couple of minutes to enable the boot menu option on my laptop.
   Running Linux Mint the first time was kind of strange. Overall it looks quite a bit like Windows. It is essentially a simple point and click interface like we are all used to. I would say that it just doesn't quite have the polish that Windows has, though if you are used to running Windows XP it may be a step up.
   There is what is essentially a "Start" button in the lower left-hand corner that opens a menu where all of the preloaded programs are located. It comes with OpenOffice as well as Mozilla Firefox and a handful of other programs that roughly correspond to things within Windows.
   Being mainly a Chrome user, I immediately went and downloaded Chrome for Linux and installed it. I used the version Google packaged for Ubuntu, as I was told Mint and Ubuntu are very similar. It was really easy to install. Once it was downloaded I simply had to click the installation package and it ran an installer program. A minute later I was importing my bookmarks into Chrome and browsing the web.
   And that ends the simplicity of using Linux. Everything I have attempted to do since installing Chrome has been something of a headache.
   The next thing I tried to install was an anti-virus program. Linux users the world over will swear that you never need an anti-virus program in Linux. I have seen this claim debated in many forums. As I am not qualified to say, I will just assume that no computer system is free from security issues. In any case, my purpose was to install software that could scan Windows-based systems from a safe environment. The idea being that if my laptop or a computer I support gets a serious infection, I could boot to Mint, run a check on the drive(s) and clean the system.
   I found a number of anti-virus programs designed to run within Linux that were advertised to scan Windows drives. Once I found one, I was shocked at how difficult it was to install. It required opening the command line terminal. Many of you may remember when computers ran DOS. You had to type commands into a black screen and hope that you typed them right in order for the computer to do anything. If you were studious you may have learned a lot of commands. You may remember some of them still. But in today's world that stuff is mostly forgotten and in many cases not needed. In the Linux world it is still how everything is done. The graphical user interface is still something of a facade.
   To end this part of the story I'll just say that I still have not figured out how to install this anti-virus software, or any other for that matter. Perhaps one day I will get online and learn the commands required. Yes, it requires more than one command to install.
   The next hurdle in my Linux adventure came when I tried to connect to my home internet using the wireless card in my laptop. I suppose Windows has made me lazy. I remember in the DOS days that all hardware came with driver disks. Even whole computers would come with disks containing drivers for preinstalled hardware. Then one day Microsoft decided it would create a vast database of drivers and make its operating system automatically detect new hardware and at least attempt to install it from that database.
   Linux has some drivers built in. I have to say that the Ethernet port worked without installing additional drivers. The wireless card would not. I looked on the internet for some kind of fix. From what I can see so far, the only possible way to make it work is to get a report from my system, upload it to an experts' forum, then wait and hope that the card is even supportable.
   Now, all this wouldn't be so bad, especially considering that Linux is free. However, it is difficult for me to live with these difficulties when many Linux advocates love to say things like, "Linux can do anything Windows can do."
   Can I play StarCraft on Linux natively? Can I run MS Office? No. In fact, just browsing the web wirelessly has proven difficult, to say the least.
   I have recently run across statements on the internet claiming that Linux is ready for everyday users, even somebody's parents or grandparents.
   The point is my parents want to get on their computer and do something. They do anything from playing games to working on genealogy. They don't have the patience to deal with command lines and I don't have the time to be doing it for them.
   With Linux it may be possible to do anything one can do with Windows. But at this point in development Linux is far more confusing than most other computer systems out there. As computers get more powerful and more compact, users are looking for simplicity and convenience. Computers should be a way for the average user to get more things done, not a way to find more things that need getting done. From where I sit, the only Linux-based system that is even capable of this at this time is Google's Android, which ultimately only uses some Linux guts.
   For now the world of computing for the masses still belongs to Microsoft Windows, Apple's OS X/iOS and Google's Android. My advice: don't get your parents or grandparents anything else.