Overclocking update

One thing I have learned over the last week is that overclocking is challenging and somewhat addictive. I thought it would be a simple tweak of a few BIOS settings, but you soon find yourself chasing every last MHz out of your processor at the lowest possible voltage to avoid frying the motherboard and/or processor. “Frying” is a rather technical term referring to one of two things:

  1. Overheating – raising the processor core temp above its thermal specification rating for extended periods of time
  2. Overvolting – operating the processor at a voltage exceeding the maximum end of the VID Voltage Range

The other potentially bad thing that can happen when overclocking is that your processor can stop operating correctly while still being able to boot and load an entire operating system. This can trick you into thinking that your overclocked setup is working correctly when it really isn’t. A typical CPU executes literally billions of instructions per second, so if one or two or even twenty of those instructions execute incorrectly, the result may not even be noticeable to a typical operating system or to the software applications that you run.

My architecture students should be able to tell you what happens if the clock rate is too high — signals do not have enough time to propagate to their final destinations, so an incorrect value gets latched into memory or into a register before the newly calculated correct value arrives there.
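To make that concrete, here is a tiny back-of-the-envelope sketch (in Python, with completely made-up delay numbers) of why the clock period has to cover the slowest signal path:

    # Toy timing model: the clock period must cover the worst-case path delay
    # plus the register setup time, or a stale value gets latched.
    # Both delay figures below are invented for illustration only.
    critical_path_delay_ns = 0.25   # slowest combinational path (hypothetical)
    setup_time_ns = 0.05            # time the register needs its input stable (hypothetical)

    min_period_ns = critical_path_delay_ns + setup_time_ns
    max_stable_clock_ghz = 1.0 / min_period_ns   # a 1 ns period corresponds to 1 GHz

    print(f"Max stable clock: {max_stable_clock_ghz:.2f} GHz")
    # Pushing the clock past this point shortens the period below min_period_ns,
    # so some registers latch values before they are valid: silent corruption.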

That is why overclockers know that you can’t trust your setup until you run a few good benchmarks and at least one torture test. Two common torture tests that you can download and run for free are prime95 and Intel Burn Test.

I downloaded and installed prime95 and started to run it on what I thought was a stable 3.6 GHz overclock. It lasted less than a minute before crashing with a hardware failure. The way prime95 works is to perform a series of calculations and compare the results with known correct answers that are part of the data distributed with the program. If the calculated answer doesn’t match the known correct answer, then the only explanation is faulty hardware. Also, my CPU core temp started to approach the Intel thermal specification of 71°C very quickly after raising the core voltage to increase stability. This is why overclockers need intricate cooling setups!
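The same principle can be sketched in a few lines of Python. This is only an illustration of the self-checking idea, not how prime95 is actually implemented (prime95 ships its reference answers with the program; here the reference is just computed once up front):

    import math

    # Self-checking torture loop: redo the same heavy computation repeatedly
    # and treat any mismatch with the reference answer as a hardware error.
    # A real torture test would ship precomputed reference values instead of
    # computing KNOWN_GOOD on the machine under test.
    KNOWN_GOOD = sum(math.sqrt(i) for i in range(1, 1_000_001))

    def torture(iterations: int = 100) -> None:
        for n in range(iterations):
            result = sum(math.sqrt(i) for i in range(1, 1_000_001))
            if result != KNOWN_GOOD:
                raise RuntimeError(f"Hardware error on iteration {n}: "
                                   f"{result!r} != {KNOWN_GOOD!r}")
        print("All iterations matched the reference answer.")

    if __name__ == "__main__":
        torture()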

So I resolved to scale back my overclocked setup to 3.4 GHz with an 800MHz memory speed, which is still a 20% overclock from the stock 2.83GHz setting. With a core voltage setting of 1.26V, prime95 was able to run for 1 hour before I stopped it, with temps just reaching 65°C. I am using a cheap aftermarket air cooler, so I am hesitant to let it run longer without manually keeping an eye on the temp and shutting it down if it climbs above 71°C.

Prime95 run – just over 5.5 hours with a CPU temp warning
As you can see in the screenshot, Prime95 has made it just over 5.5 hours without error after I left it running overnight. The ASUS software PC Probe II has popped up the temperature warning. By stealing CPU cycles from Prime95 and then idling the CPU during the ensuing context switch, the temperature warning also acts as a very minor temperature governor.

ACM Mid-Southeast Conference Today!

After a late start, we made it into Gatlinburg, TN just after 1AM. We got to drive through lightly falling snow and see all the beautiful Christmas lights with hardly any traffic on the roads. Now it is on to the very serious business of wrapping up the presentation I am due to give in exactly 12 hours. I am close, but not quite finished with my primary experiment. I should have it wrapped up soon and will update this post with a copy of the PowerPoint once it’s ready.

Update – the presentation went well! Here are the PowerPoint slides:


Investigating the impact of Ajax on server load in a web 2.0 application PowerPoint (3.1MB)

New (to me) research bibliography tool

I have been a long-time user of bibliography search engines such as CiteSeer, the ACM Portal, IEEE Xplore, etc., but I have always turned to standalone applications such as EndNote and JabRef to manage the articles I find. This summer I created my own web-based annotated bibliography tool and just barely got it off the ground before the need for it subsided. Tonight, as I am organizing my references for an upcoming conference presentation, I discovered a tool that surpasses the one I created and adds a social networking element. The website is called CiteULike (screenshot below) –
CiteULike social networking for academics

As far as I can tell, there are only two ways to group entries in your library: create a special “group” that any registered member can join and post entries to, or tag each entry that you want to belong to a category with a special keyword. I have opted for the latter approach, as I only want articles that I post to be in my reference library. The screenshot shows a post that I just made. The site seems to be pretty responsive, and there is a bookmark link that you can add to your browser toolbar to quickly create a new bibliography entry from the page you are currently viewing.

Overclocking adventure

I have always been conservative when it comes to the stability of my computer systems. My reasoning is that I have invested too much money in the equipment to ruin it by being performance greedy. Today, though, I read an interesting article on Tom’s Hardware that convinced me to give overclocking a try.

Motivation

I recently upgraded my primary development machine. This new Intel-based system will serve double duty as my production server in the near future. During the current development stage of my e-commerce site, I would like to overclock the server and see how stable it is at a relatively conservative overclock setup. The current limiting factor for my target application is a CPU-intensive program called Global Mapper, which spends the majority of its time maxing out the CPU after an initial hit on the hard drive to load data into RAM. This is followed by an eerie silence with zero hard drive activity and 100% CPU load for minutes to hours (for really large maps) while the CPU performs the calculations to determine how best to render the mapping data. Increasing the CPU clock rate directly affects the rate at which maps can be produced – thus overclocking my processor by 20% should net close to a 20% performance boost in this particular application.
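As a quick sanity check on that claim, here is an Amdahl’s-law-style estimate, where the 98% CPU-bound fraction is my assumption rather than a measured number:

    # Estimated speedup from a clock bump, Amdahl-style.
    # cpu_fraction is an assumption: the initial disk load is a tiny slice
    # of a multi-minute render, so the job is treated as ~98% CPU-bound.
    def estimated_speedup(clock_gain: float, cpu_fraction: float = 0.98) -> float:
        """clock_gain = new_clock / old_clock, e.g. 3.4 / 2.83."""
        return 1.0 / ((1.0 - cpu_fraction) + cpu_fraction / clock_gain)

    gain = 3.4 / 2.83   # roughly 1.20, i.e. a 20% overclock
    print(f"Clock gain: {gain:.2f}x")
    print(f"Estimated overall speedup: {estimated_speedup(gain):.2f}x")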

The system specifications

  • Intel Core 2 Quad Q9550 processor, 2.83GHz, 12MB cache
  • ASUS P5Q SE/R Motherboard (Intel P45 chipset with ICH10R Southbridge providing RAID)
  • 8GB G.Skill P8500 DDR2 RAM operating at 1066MHz
  • (2) 500GB Seagate Barracuda 7200RPM, 32MB cache hard drives operating in RAID 0
  • 500W energy-efficient Antec EarthWatts power supply

The overclocking experience

I began with Internet research to see how other people had fared overclocking similar systems. I found this article on Tom’s Hardware particularly helpful: “How would I overclock q9550 to 3.4ghz”. I decided I would first try raising the FSB from its default of 333MHz to 400MHz and leave all other settings on Auto to let the motherboard adjust as needed. Well, I managed to boot into Vista, but as soon as I loaded Global Mapper and tried to import a large dataset, I had a complete system lock-up. I suspect that the motherboard did not raise the CPU core voltage enough to compensate for the faster clock rate.
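For reference, the core clock on this chip is simply the FSB multiplied by the CPU’s locked multiplier (8.5 on the Q9550), which is why a 400MHz FSB targets 3.4 GHz. A short sketch of the arithmetic:

    # Core clock = FSB x CPU multiplier; the Q9550 multiplier is locked at 8.5.
    MULTIPLIER = 8.5
    for fsb_mhz in (333, 400):
        print(f"FSB {fsb_mhz} MHz -> core clock {fsb_mhz * MULTIPLIER / 1000:.2f} GHz")
    # FSB 333 MHz -> core clock 2.83 GHz  (stock)
    # FSB 400 MHz -> core clock 3.40 GHz  (overclock target)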

After a reboot and a BIOS update (required by the latest version of ASUS AI Suite, which monitors various CPU and motherboard sensors and can change BIOS settings dynamically without a reboot), I had the system up and running with the following settings entered manually into the BIOS:

  • CPU vcore: 1.28125V
  • FSB: 400MHz
  • Memory speed: 400MHz (800MHz effective thanks to the double data rate interface; see the sketch after this list)
  • Memory timings: 4-4-4-12
  • Memory voltage: 1.8V (matches the specs for the G.Skill memory)
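A quick note on the memory figure above (a small sketch of the arithmetic, with my own variable names): the “800 effective” comes from DDR2 transferring data on both clock edges, and at a 1:1 ratio with the 400MHz FSB the kit is running below its rated 1066:

    # DDR2 is double data rate: two transfers per clock cycle, so the
    # effective rate is the memory bus clock x 2.
    memory_clock_mhz = 400   # running 1:1 with the 400 MHz FSB
    effective_mts = memory_clock_mhz * 2
    print(f"{memory_clock_mhz} MHz DDR2 -> {effective_mts} MT/s effective "
          f"(this kit is rated for 1066 MT/s)")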

The results

The screenshots below tell the story: overclocked at 3.4 GHz, able to successfully complete the PC Mark Vantage x64 test suite, memory running in dual channel mode at an effective 800 MHz. So far the system has been up and running for a few hours with no more lock-ups. I plan to run prime95 tonight to see if it can make it all the way through. Since a number of people have reported overclocking this setup to 4 GHz and beyond, I am not too worried!

CPU-Z screenshots – View the online validator results.


Modest improvement of 7.5% over the non-overclocked setup (not shown; PC Mark result = 6934)

The reason for the limited 7.5% improvement, even though my clock speed went up by 20%, is that PC Mark Vantage stresses hard drive access times, for which I only have a mediocre setup — even though I have incredible transfer rates from the drives to memory. A more telling figure is the time required to generate the exact same map in the overclocked setup vs. the original non-overclocked setup. Generating the color-coded shaded relief map for the entire state of Colorado took 15.9 minutes in the original setup. With the overclocked setup, the same map took only 13.5 minutes — almost 18% faster!
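The arithmetic behind that last figure, as a quick check:

    # Throughput comparison for the Colorado shaded relief map.
    stock_minutes = 15.9
    overclocked_minutes = 13.5

    throughput_gain = stock_minutes / overclocked_minutes - 1   # ~0.178
    time_saved = 1 - overclocked_minutes / stock_minutes        # ~0.151

    print(f"Throughput improvement: {throughput_gain:.1%}")   # about 17.8% faster
    print(f"Wall-clock time saved:  {time_saved:.1%}")        # about 15.1% shorter run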

Computer upgrade: ATI HD 4670 graphics card

The last component of my computer upgrade arrived today in a surprisingly small box. I purchased an ATI HD 4670 graphics card based on the Tom’s Hardware recommendation for entry-level graphics cards. I didn’t have a lot of money to spend on this upgrade, but I still wanted a card that would let me harness some parallel processing power via GPGPU programming. The HD 4670 has 320 stream processors, which I am hoping to experiment with to increase the map rendering speed for topocreator.com.

After installing the card, I updated my Windows Vista Experience Index and it went from 3.3 up to a whopping 5.8 (5.9 is the maximum possible score). Note that my Primary hard disk score – (2) 500GB Seagate drives in a RAID 0 configuration – is rated at 5.9. This is significant for the PC Mark Vantage experiment further down in this post.


Windows Vista Experience Index – 5.9 is the highest score possible

Next, I re-ran PC Mark Vantage, hoping for a figure somewhere in the 8000s. Instead, I saw an increase from 6144 to 6655 – much lower than expected. Further investigation revealed that the tests involving my HDD (hard disk drive) had very low scores. I discovered that Windows Search was indexing my drive in the background, so I disabled it and re-ran the benchmark. This produced a further increase to 6934 with slightly improved HDD performance. See screenshots below:


Computer upgrade: one week update

check out the coffee cups

The computer upgrade has gone really well, but it has been quite an adventure. The coffee cups on the table tell the story — lots of early mornings and late nights! The early results have been very promising on two fronts: 1) individual maps can be produced two to three times faster than on the original machine, and 2) I was inspired during the upgrade to write a workshop paper for ICSE 2009 documenting my use of Microsoft Office OneNote to bring topocreator.com from idea to live e-commerce website!

All that is left at this point is to replace the graphics card I am currently borrowing with ATI’s top “entry-level” card, the ATI HD 4670, which should arrive on Tuesday. Since ATI is now owned by AMD, this is a great way of spreading the love between processor manufacturers, given that everything else in the system is Intel-based.

Computer upgrade status

On Tuesday, the new 500GB Seagate hard drives arrived at the house. This was to be my first experience setting up drives in a RAID configuration, let alone installing a Windows operating system onto the array — definitely a learning experience!

Desired setup:

  • RAID 0 (performance) using two 500GB hard drives for a total capacity of 1TB
  • Windows Vista Business 64-bit edition with 8GB of RAM

The installation started with me disconnecting all the cables, opening the case, and loading the two hard drives into an internal bay. This was the easy part – 30 minutes if you include all the unwrapping, unpacking, etc.

Next, it was time to use the BIOS utility to create the RAID volume. This was also a snap, once I figured out that you have to hit Tab to see the POST messages before pressing the designated key combination of Ctrl+I to access the hidden RAID utility built into the BIOS. Again, no problem — 10 minutes with a couple of reboots.

Finally, I was ready for the Windows Vista installation. This is where I ran into a number of problems with the installer. I had already determined from my initial research that there is no “winging it” with the Windows installation: you MUST have the correct driver for your RAID controller. There is no generic driver included with the Windows installation media that works for a RAID volume, so if the driver for your particular RAID controller is not included in the Windows installation package, you must supply the manufacturer’s driver to proceed. There is a special screen in the Windows Vista installer where you can load a driver for accessing your hard drive. I was able to download the driver onto a USB flash drive and have the Windows installer pick it up, but I was still receiving a “Windows cannot find a volume that meets system requirements” error. I tried several different versions of the driver, but none of them worked. Finally, I tried updating the BIOS, even though the factory-installed BIOS was only a few months old. Bingo! The install worked perfectly on the next attempt!

Once I had everything up and running, I was able to run PC Mark Vantage, and this time I noticed a marked improvement over my current development machine — almost twice as fast! My Windows Vista Experience Index also maxed out the Processor, Memory, and Primary hard disk categories with a 5.9 in each (the maximum rating). My graphics and gaming graphics scores are still weak, but my new entry-level gaming graphics card, an ATI HD 4670, is going to be here next week and should take care of that. I am hoping to dive into the field of GPGPU (general-purpose computation on graphics processing units) and take advantage of the extreme parallelism found in modern graphics cards to create maps even faster.

Live blogging a computer upgrade

It’s Saturday morning, and I am in the process of upgrading an old computer to help support the development of topocreator.com. This website has inspired a number of my research projects along the way to (hopefully) becoming a successful e-commerce site. The backend program I am using to create maps is quite CPU intensive (0% hard drive activity after the initial data load, 100% CPU usage for minutes to hours depending on the size of the map).

The computer I am upgrading is about 7 years old and is one that I built while a grad student at UC Davis. Its specs: dual Pentium III 1GHz processors with 512MB of RAM (the maximum allowed by the motherboard). The only components that will remain after the upgrade are the case, an old graphics card, and a CD writer. All other components will be replaced. The upgraded system will have the following components:

  • Intel Core 2 Quad Q9550 processor, 2.83GHz, 12MB cache
  • ASUS P5Q SE/R Motherboard (Intel P45 chipset with ICH10R Southbridge providing RAID)
  • 8GB G.Skill P8500 DDR2 RAM operating at 1066MHz
  • (2) 500GB Seagate Barracuda 7200RPM, 32MB cache hard drives operating in RAID 0
  • 500W energy-efficient Antec EarthWatts power supply
  • A very old CD writer that still works (hopefully)
  • A new LightScribe DVD/CD writer combo drive
1 – Analise and Josiah help unpack the box from newegg.com
2 – All the new components are laid out on the table
3 – The current setup, showing the computer to be replaced on the bottom

11:15AM So that brings us to where we are now. I am making a rough outline of the next steps to tackle.