I just took a quick look at the statistics for the OpenMoko project and it made me happy.
Currently there are 3500 revisions in the svn repository, and bugzilla has reached 1022 bugs, 241 of which are still open. The wiki has 3685 pages and 6090 registered users, and maybe more impressively, the projects site has 1507 developers and 81 projects registered.
The IRC channel on freenode has stayed at around 320 people for the last six months and there are a lot of interesting discussions going on. It's a friendly atmosphere, and people interested in the project are encouraged to drop by. Usually they get pointed to the right place in the wiki, where most questions have already been answered. A good page to read if you want to know what is going on in the project is the Community Updates page. Right now it hasn't been updated in over a week, but I'm sure it will be soon.
Unfortunately, a hardware bug in the GTA02v4 power management has required a fifth revision of the GTA02 hardware. It's a bit of a shame that the release will be delayed even further, but I'd rather have good hardware later than something broken right now.
The most interesting code changes in OpenMoko recently involve gsmd, PhoneKit (pdf) and the dbus interface to it. The dialer and SMS handling are also being worked on at a furious pace by the OpenedHand guys. For about a week now, calling has worked without any problems for me. Power management has been improved and the phone should last at least a day now.
Also, some guys from Ixonos have been working on an alternative to gsmd, called gsmd2. I haven't looked at the code, but they have some very nice documentation and a detailed specification. Still, there seems to be a lot left to discuss.
The GPS binary driver is still not available for download. OpenMoko have been promised by Global Locate that they will be able to distribute it, but the legal terms are not yet set. Hopefully this will be solved soon.
All in all, a lot is going on and the software will be in pretty good shape for when the GTA02 is released.
2007-11-24
2007-11-21
Our future climate concerns me
A few days ago the IPCC (United Nations Intergovernmental Panel on Climate Change) released a document they "succinctly" call Policymakers' Summary of the Synthesis Report of the United Nations Intergovernmental Panel on Climate Change (IPCC) Fourth Assessment. It's a complete summary of all the data on climate change, densely packed into 23 fact-filled pages. (The full reports are much longer.)
The summary quickly showed up everywhere in the news, even at my favorite Ars. Reading through the document does make you think about the future of our climate. Eleven of the last 12 years rank among the 12 warmest years measured since records began in 1850. For me personally, that has meant some great long, warm summers and nothing negative. For people living in northern Europe, as I do, global warming will mostly make things better: forests and crops will grow better, summers will be warmer and winters less freezing. The only downside for us will be an increase in precipitation. Most places will not be this lucky.
Although the temperature will increase the most at the poles, it's the places that are already dry and hot that will get the most serious problems. Serious droughts will follow, wildfires will increase, and agriculture and livestock will suffer. That might eventually lead to malnutrition, and on top of that, clean drinking water will become a problem. The number of cyclones and storms is likely to increase too, and low coastal regions and river deltas will face an increased risk of flooding.
[Figure: Global increase in temperature for 2099 compared to 1999]
The study also expects that in the long term, if the warming continues, the ice sheet on Greenland will melt completely and raise the sea level by about 7 meters. This will take a thousand years or so, but there will be a noticeable increase just in the next 100 years.
The primary cause of the emission of greenhouse gases (GHG) is the use of fossil fuels. The concentrations of carbon dioxide and methane in the atmosphere far exceed the natural range seen in the last 650,000 years. We need to cut down on greenhouse gas emissions on a global scale right now, and even the most optimistic scenarios still point to an increase in global temperature.
A global increase of 1.5-2.5 degrees Centigrade would put 20-30% of the species assessed at risk of global extinction. Most scenarios in the summary suggest a much higher increase in temperature...
If all future investments in infrastructure and energy plants were shifted to get the lowest possible CO2 emissions, the additional investment cost would only be around 5-10%. That's not much at all, and simply increasing the efficiency of energy supply and industrial processes would do a lot to stabilize GHG emissions on a global scale. It's good to see that the UK is helping China to get started on this.
Maybe we have no choice but to lower our oil consumption. A recent article in Wired states that we will most likely be unable to maintain the current consumption because we just can't pump the oil up fast enough. Some say 10 more years is all we have.
My personal opinion is that oil based fuels are way too cheap. If prices were at least doubled, maybe driving around in a petrol car wouldn't be the cheapest way to travel medium distances any more. Electric or hydrogen fuel cell cars, although still very expensive, would become a more viable option. Flying is also cheaper than it should be, and even though it's nice to be able to afford to fly away for vacation, I wish there was a less polluting option for long trips. I really do.
2007-11-10
Memory and learning
I've been thinking a lot about learning and memory lately and there's been no shortage of interesting articles to read or videos to watch.
Over at wondr.net Jamin has started his Memory Month. He posts a new article every day, teaching you useful tricks for remembering everything from shopping lists and numbers to the names of people you've just met. These methods will probably work great, but I'm too lazy to really sit down and do it since there are so many other things distracting me right now.
I don't believe in the old myth that humans only use 10% of the brain, but I do believe that there's a lot about the brain we don't understand at all yet, and maybe we never will. At Google Video they have some very interesting documentaries regarding the brain, for instance this documentary from Channel 5 in 2005 about Daniel Tammet, or this one about Kim Peek. These savants have enormous counting skills and huge memory capacities, but the brain power comes at a price, ranging from mild autistic tendencies like Asperger's to completely anti-social autistic behavior.
It's been noted that these types of disorders are getting more and more common, especially in academic families, and it's even been called The Geek Syndrome by Wired. The latest studies say that somewhere between 3 and 20 genes are involved in causing these disorders. Incidentally, this also seems to affect maths and science skills in a positive way. Abnormalities in the cerebellum, the "little brain" responsible for motor control and for filtering sensory input and passing it on to the right part of the brain, are common in autistic persons. Kim Peek, for instance, has a damaged cerebellum and no corpus callosum, the connection between the two halves of the brain, at all.
The brain is a pattern matching machine, and it seems to me like it automatically filters the continuous flow of information washing over us; if it isn't filtered enough, the person might be classified as slightly autistic. If the brain filters too much, parts of it just don't get used enough and will dwindle away. Of course this is an overly simple way of looking at it, but I'm certain it's part of the puzzle.
I'm also convinced that the brain is a lot more flexible than people used to believe, even in grown-ups. Newborn babies have twice the number of nerve cells they need, but the ones that aren't used just die off. Usually this happens in two periods, first at a young age, then again around puberty. But the brain is not "frozen" after that at all. A recent study on mice shows that nerve cells move around and stretch "in a highly dynamic fashion".
It's never too late to learn something new, but I think it's harder to really focus on learning just one thing when you're grown up compared to when you're still a kid. There are just too many things to think about, too many distractions, and so many interesting things to learn.
Wish I didn't have to sleep so much, but a completely unscientific experiment on myself made me notice that when I sleep less than 6 hours per night, my memory isn't as good as it usually is. Things just don't stick. Also, I've noticed that I remember things I read in the evening better than things I read in the morning. I imagine that my brain works through the input from the day while sleeping, trying to keep only the seemingly important memories.
It's getting late, I'll go read a book.
Update: Great article about memory in National Geographic
2007-10-18
OpenMoko progress
I have had my Neo1973 for three months now and the software is finally getting usable. There were a lot of changes during August, with the totally new theme and a lot of updates to the kernel and low level libraries. I had my first successful "out of the box" call with OpenMoko using a build from ScaredyCat on September 3rd, but after that things started to go bad. The daemon handling communication with the GSM modem was more or less broken for a month and calling did not work out of the box for quite a while. That is, if you even managed to get the rootfs built. WebKitGtk failed to build almost every time, and so did many other programs.
This happened for many reasons, mostly because the understaffed development team had to focus on the hardware for GTA02 to iron out all serious bugs, and also because some of them finally had some well deserved vacation.
Despite these issues, the first batch of Neo1973's sold out quickly. A second batch was produced during late September and more people got their hands on the GTA01 phones in early October.
Trolltech also published an image for the Neo1973 and it worked really well. I even managed to use up the rest of my prepaid SIM card calling friends.
Since then things have really sped up again. OpenMoko hired XorA to manage OpenMoko in OpenEmbedded and take care of build issues and bitbake recipes. This quickly made a big difference when it comes to getting the complete OpenMoko distro to build. Also, WebKitGtk now compiles more often than not, and the default build has seen several additions. It now includes openmoko-browser2, based on WebKitGtk. The browser is still a bit unstable and the design is awful, since only about 50% of the screen is used to display the actual web page you're browsing.

The media player has also gotten a face lift and is becoming usable, with the exception that the mp3 decoding library is not yet optimized for the Neo. There are still some usability issues, like the volume slider being difficult to control with just your fingers, and the way you add files to playlists.

The manufacturing of the second hardware revision of the Neo1973 is just about to start, and hopefully all eager developers can get hold of a GTA02 before Christmas. If I didn't already have a GTA01 I would probably wait for the GTA02. The WiFi and faster processor would surely be nice to have. Of course the bigger flash storage, the accelerometers and the graphics accelerator are nice too. In addition to that, the AGPS chip has been changed to a U-blox. (For a comparison of the two revisions of the phone, look here.)
Talking about that, I really hope that Broadcom will release an EABI driver for the Global Locate chip in the GTA01. Since Broadcom bought Global Locate it's been awfully quiet about the GPS, and even if you can get the old driver working in a chroot, it's a shame it takes so much effort. There are also no applications whatsoever for using the GPS, beyond a few shell scripts at least.
2007-09-19
Science and Nature Writing
I just finished reading The Best American Science and Nature Writing 2006. This is the seventh volume in the series and I definitely wish I had found out about it earlier. Each book is made up of about 25 articles that a guest editor selects from about a hundred articles chosen by Tim Folger.
The 2006 edition is edited by Brian Greene who just happens to be one of my favorite physics writers. The articles span everything from bittorrent and blogs to animal psychology, anthropology and advanced physics. Coming from sources like Scientific American, The New York Times and Wired, most articles are easy to read and don't get detailed to the point of being boring. I had actually managed to read some of the articles already, but the others were a great way for me to be introduced to areas of science I knew very little about.
Some of my favorite articles are Dr Ecstasy about the chemist Alexander Shulgin, His brain, her brain about the differences between the male and female brain, and The coming death shortage describing how the increasing life expectancy of humans will affect us.
I encourage everyone interested in widening their scientific horizons to read this book. It's only 280 pages long and reading one chapter per night before falling asleep was perfect. 9 out of 10 points for this book from me.
2007-09-06
NerdTests.com
I remember doing this nerd test in 2005 and got the following result:
Noticed on a blog that they had a new version of the test and of course I had to take it. The questions are funny and after answering them as honestly as I could I got:
Hmm... Not sure what to say about the score, but at least I'm not a dumb awkward dork. :-)
2007-09-03
Vala and Vim
I'm a Gnome user and like reading Planet Gnome (in Google Reader of course!) to see what is going on. There's been quite some buzz around Vala lately, so of course I had to check it out. Having used Java and C# quite a lot I've learned to like the syntax. Vala is still in early development, but it's improving quickly and already works well enough to play with.
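Since the syntax is the main attraction, here is a minimal Vala sketch just to show what it looks like (the file and class names are made up for this example, it's not from any real project); it should compile with valac as-is:

// hello.vala - toy example, build with: valac hello.vala
using GLib;

public class Hello : Object {
    public static int main (string[] args) {
        // GLib's stdout FileStream gives us a printf-style output function
        stdout.printf ("Hello from Vala!\n");
        return 0;
    }
}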
To make the code look better in vim, add this to your vimrc file. I'm using Gentoo, so I put it in /etc/vim/vimrc.local:
augroup vala
  au!
  au! BufRead,BufNewFile *.vala set filetype=vala
  au! Syntax vala source /usr/share/vim/vim71/syntax/cs.vim
augroup END
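After restarting vim (or re-sourcing your vimrc), opening a .vala file should pick up the C# highlighting automatically; you can verify it with :set filetype? from within vim.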
2007-09-01
X86 processors and chipsets
It's not easy to stay on top of all the latest developments in the CPU world, but luckily you hardly have to any more. All new computers are fast enough for most users, unless you absolutely want to play the latest games. But for those who want to know, I've put together a summary of what has been going on during the last few years, what is happening right now, and some rumors about the future.
There is still a lot happening on the CPU front, even if it might feel like the increase in performance is not as fast-paced as it was a few years ago. The most notable change lately is that almost all new processors have two or more cores on each chip, but let's start from the beginning, shall we?
NetBurst and K7
Everyone even remotely interested in computer hardware knows that NetBurst, the architecture used in the Pentium 4 processor, was far from elegant. Released in late 2000, it was built purely for high clock frequencies. The first NetBurst CPU had a 20-stage pipeline, compared to the 10 stages in the Pentium III and the AMD K7 Athlons. The long pipeline made it easier for Intel to push the clock frequency way up, and soon enough the old Athlons were falling behind. AMD was simply unable to increase the clock frequencies any further with their design and manufacturing processes at the time.
K8
When AMD released their K8 based Opteron processors in the autumn of 2003 the picture changed completely. The new CPUs left the NetBurst Pentium 4s in the dust. They had a completely different design, using a short 12-stage pipeline, compared to the 20-stage pipeline used in the Northwood Pentium 4, and doing a lot more Instructions Per Clock (IPC). The K8 architecture could perform three complex x86 instructions per clock, compared to one for the Pentium 4. It had three integer and three floating point ALUs, compared to four integer and one floating point ALU for the contemporary NetBurst revision (more details here).
The Opteron also added AMD's x86-64 instruction set, making it possible to use more than 4 GB of RAM without any inefficient work-arounds. The extension also doubled the number of registers from 8 to 16, as well as doubling their size from 32 to 64 bits. Combined with a modern and efficient HyperTransport bus and a low latency on-die memory controller, there was just no way Intel could compete. They did their best, using their more advanced manufacturing process to add more cache to the CPU die and increasing the pipeline length even further. The seventh revision of the NetBurst architecture, called Prescott, had 31 pipeline stages, but it just wasn't enough. All the Pentium 4 did was get smoking hot.
Pentium-M and Core
Luckily for Intel, their engineers in Israel had been working on a new power efficient processor for use in laptops and came up with a real gem. Called a Pentium III on kryptonite, the Banias was built to keep power dissipation down. Released in laptops in 2003 under the Centrino brand, enthusiasts quickly noticed they could overclock it to perform better than any desktop Pentium 4, and even better than the latest Athlons. Intel gave up on NetBurst and continued to develop the Pentium-M into what would become the Core architecture.
The first Core based CPU, code named Yonah, was released in laptops as the Core Duo in January 2006. It performed way better than AMD's laptop offering, the Turion CPUs, which were both slower and consumed more power. Later that year Intel brought the Core architecture to the desktop with the Conroe processor. It was released under the name Core 2 Duo and was an instant success. The chips were about 33% faster than the Athlon 64s at the same clock frequency, and they also included AMD's x86-64 extensions.
AMD tried to answer Intel's offering, but now they were once again the ones with the inferior architecture. And just as before, they were unable to push their processors to the frequencies needed to compete with the Core 2 CPUs on the competitive desktop market. Debuting at 2.2 GHz in 2003, the fastest Athlon 64 FX was still stuck at 2.8 GHz in mid 2006, while the Core 2 Duo with twice the cache was already available at 2.933 GHz.
Fortunately it was still going well for AMD in the lucrative 4+ CPU server space. Intel's Core 2 Xeon server chips were still stuck with an old Front Side Bus (FSB) and were unable to compete in four-socket or larger systems. AMD, now with the knife at their throat, struggled to get Barcelona, the processor based on their new K10 architecture, ready for consumers.
Future - K10 and vPro, Bulldozer and Nehalem
The long awaited Barcelona quad core processor will be released in mid September this year. AMD are still keeping a tight lid on all the details surrounding the chip, but some leaked benchmarks from The Inquirer sure look promising. The results show that Barcelona could be pushed to over 30,000 points in 3DMark06 and 11 GB/s of memory bandwidth, compared to 7.5 GB/s for Intel's fastest "quad core". Architecturally they seem to be equals, but the question is whether AMD will be able to compete for the top spot when they're always a step behind Intel in the manufacturing process.
To steal some thunder from AMD, Intel will release a new platform called vPro at the same time. First presented in 2002 under the name LaGrande, together with Microsoft's controversial Palladium initiative, these extensions are now known as Trusted Execution Technology (TXT). Intel have announced three new Core 2 Duo processors for use with the vPro platform. Intel is pushing it as the ultimate virtualization technology, which in some ways it is, but it can also be used as a very powerful DRM mechanism. Hopefully people will realize what this means and how "Big Content" can use it to lock you out from using your files as you want to. Hannibal describes his concerns well in his Ars Technica article about vPro.
Next year Intel will start releasing systems using their new Common System Interface (CSI), finally replacing the old FSB that's been a bottleneck for way too long. Initially it will be intended for multi-socket servers, but the technology will trickle down to desktops and laptops later on.
For AMD the future seems to be about Bulldozer. It will be the first CPU with SSE5 and will be built with AMD's new modular Fusion architecture. The idea is to be able to mix several types of cores on the same die, including on-die Graphics Processing Units (GPUs). Planned for release in 2009, they will be up against Intel's next chip architecture, called Nehalem.
Tip of the iceberg
I've intentionally left the EPIC, SPARC, Cell and POWER architectures out of this. They might be technically more interesting, but they're not x86. Maybe that's something I should get back to another day, if only to make fun of the Itanic or to admire IBM's POWER6 beast. IBM are producing the CPUs for all three new consoles, the Wii, the Xbox 360, and the PS3, and there's a lot to say about all of them. I've also avoided talking about what's happening in the rapidly growing System on a Chip world. Mobile computing is exploding and there are many chip manufacturers who want a piece of the action.
I've listed several well written articles at the end of this post. If you really want to dig into details like vector processing instructions, memory bandwidth or the different manufacturing processes, they are all worth reading.
My favorite tech writer, Jon "Hannibal" Stokes, has recently released a book called "Inside the Machine". I'm ashamed to say I haven't read it yet, but it's definitely something I will do as soon as I can. Expect a review of it.
While reading forums and comments on articles about processor architectures I've come up with this modified version of Godwin's Law; we can call it Mogren's Law:
"As an online discussion about processor architecture grows longer, the probability of a comparison involving Alpha or RISC approaches one."
The Alpha was a beautiful design though; too bad the EV79 and its successors were canceled. Luckily the clever guys behind the Alpha design were quickly hired to work on other things...
Into the Core: Intel's next-generation microarchitecture - Jon Stokes - April 05, 2006
Inside Barcelona: AMD's Next Generation - David Kanter - March 16, 2007
Intel Core 2 Duo - Franck Delattre and Marc Prieur - June 22, 2006
Core 2 Duo and the future of Intel - Rob Hallock - November 5, 2006
AMD K10 Micro-Architecture - Yury Malich - August 17, 2007
The Common System Interface: Intel's Future Interconnect - David Kanter - August 28, 2007
Intel's new vPro: two steps forward for x86... as well as for DRM and P2P? - Jon Stokes - August 27, 2007
2007-08-19
OpenMoko 2007.2
A few weeks ago I ordered the Neo1973 phone and I have been playing around with it a bit since then. The software is currently in an early alpha state and often fails to even build correctly, but that's what we all expect at this early phase of development.
The easiest way to build a complete image to flash on the phone has been to use Rod Whitby's MokoMakefile. Basically all you have to write is "make openmoko-devel-image" and wait a few hours. To compile the whole OpenMoko distribution from scratch takes about 10 hours on my AMD64 3000+, so it's not something you just do in a heartbeat.
The OpenMoko Wiki is really working well and it's still expanding quickly. #openmoko on Freenode is also a very busy channel, currently around 320 people in it. There's always someone there to point you in the right direction if there's some trouble with your Neo.
I have yet to write any applications of my own to install or include in the rootfs, but I'm trying to keep up with the latest development and I'm still learning how bitbake works.
I haven't even been able to make a call with the phone yet, or been able to use the GPS. But that's fine for now; I did order a development sample of a phone, after all. The battery on the Neo doesn't last long either, since the power management isn't really working yet. Basically it can be on for a few hours, then it dies completely.
The screen looks great though, and the size is nice. The touchscreen feels good, but is a bit difficult to use near the edges. It's definitely still a bit slow to use, but the GTA02 version (mine is a GTA01) will have some nice additions, including WiFi, a 3D graphics accelerator and a faster processor.
There are still a lot of people ordering the developer version of the phone and I'm sure we'll see a lot of interesting applications fairly quickly. I do worry a bit about their stability and quality though. Hopefully they will lock down a stable release in good time before the public release of the phone, so that there's enough time to test and bugfix without introducing more problems.
I have no doubt that there will be lots and lots of games and utilities for this phone in a year's time. The Neo1973 is still only for tinkerers who like to mess around, but the future is promising.
2007-08-08
Good programmers and getting things done
Now I'm back from vacation and have been catching up on my blog feeds. I'm sharing all the posts and articles I feel are truly interesting in my Google Reader feed (and page), but the following two deserve a special mention:
Jeff Atwood's "Yes, But What Have You *Done*?" makes me want to "Do it f***ing now"!
The blog post at RevSys called "A Guide to Hiring Programmers: The High Cost of Low Quality" talks about expert programmers and really makes me want to be one.
Both touch on the subject of getting things done and being able to show some finished work. I feel like I'm doing too little in too many places, only touching the surface of the projects and communities I'm involved in and spending too much time reading and talking. Guess it's time to start getting things done!
P.S. If anyone cares, I've given in and can now be found on the infamous Facebook.