Sun Microsystems VirtualBox

Wow – Sun Microsystems has been busy, discreetly releasing and/or acquiring all kinds of important open-source software projects. The banner of logos shown above just about sums it up.

Today I discovered VirtualBox while working on a cluster computing project. VirtualBox isn’t directly related to cluster computing, but it can be used to run multiple compute nodes for testing. VirtualBox is an open-source equivalent to VMware’s popular VMware Workstation product. With the performance of today’s hardware, the ability to run multiple operating systems simultaneously on a single machine is becoming a reality for more and more people.

The basic idea is that installing VirtualBox gives you an application and a small set of services that let you create virtual machines, each running an entirely different operating system in a window on your host operating system. I have just completed an install of the latest version of Debian into a virtual machine. The entire process (including the several-hundred-megabyte download) took less than one hour to complete. Now I can boot up a Linux operating system whenever I want to run an application only available on Linux (e.g., KMines 🙂).
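For the curious, the same steps the GUI walks you through can be scripted against VirtualBox’s VBoxManage command-line tool. Here is a rough sketch in TypeScript (Node); the VM name, memory size, and disk size are arbitrary, and the exact VBoxManage flags vary between VirtualBox versions, so treat this as an outline rather than a recipe:

```ts
// Hedged sketch: create and boot a Debian VM by shelling out to VBoxManage.
// Flags differ across VirtualBox versions; this is an outline, not a recipe.
import { execSync } from "child_process";

const run = (cmd: string): void => {
  console.log(`> ${cmd}`);
  execSync(cmd, { stdio: "inherit" });
};

run("VBoxManage createvm --name Debian --ostype Debian --register");
run("VBoxManage modifyvm Debian --memory 512 --boot1 dvd");    // 512 MB RAM, boot from DVD first
run("VBoxManage createhd --filename Debian.vdi --size 10240"); // 10 GB virtual disk
run("VBoxManage storagectl Debian --name SATA --add sata");    // add a SATA controller
run("VBoxManage storageattach Debian --storagectl SATA --port 0 --device 0 --type hdd --medium Debian.vdi");
run("VBoxManage startvm Debian");                              // boot it (attach the install ISO first)
```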

The screenshots below give you a glimpse of how it works. The first screenshot shows the virtual machine configuration options, which mirror everything you would find on a real machine. The second shows Linux running in a window on my Windows Vista host operating system.

VirtualBox configuration options for my Debian Linux virtual machine

Debian Linux running in a virtual machine powered by VirtualBox

Going live with Mesa Online

This morning we went live with Mesa Online, a website for students taking Spanish courses in the World Languages department. The idea behind the Mesa program is that students sign up for a time to have lunch or dinner with a small group of students and one faculty member, and during the meal all conversation must be in Spanish. The previous sign-up system was a large list of timeslots posted on the wall. My Fall 2008 software engineering class took on the task of converting the paper-based system to an online one, and I helped the students bring the project to a conclusion just in time for it to go live today for the Spring 2009 semester. As of 3:00 PM, exactly 100 students have created their Mesa accounts. There is still work left to be done on the administrative part of the website, but I will post updates as the semester progresses and as we see whether the system effectively meets the needs of the World Languages Department here at Samford.

Welcome to Spring 2009


Welcome to another exciting semester at Samford University! Here is a quick summary of the classes and work just around the corner!

  • COSC325 – emphasizing web languages to demonstrate concepts of programming languages
  • COSC495 – senior seminar – details of student projects to be posted soon!
  • SUVS – Samford University Virtual Supercomputer

Under hacker attack!

Update – Apparently it was about May of this year that there was a large surge in SSH password-guessing attacks. I believe that my computer became a target sometime after that. Here are some good articles reporting on the situation:

“Brute-Force SSH Server Attacks Surge” by InformationWeek

“Brute-force SSH attacks surge” by SC Magazine

This may not be news to many of you, but my new home development machine is under attack! This isn’t your typical script-kiddie HTTP attack, but rather a full-blown SSH password-guessing attack. Unfortunately, I did not take screenshots of everything as I detected the attack (which has been going on for about two weeks now), but I do have a few screenshots to help describe the timeline of events:


1 – I opened Process Explorer (an excellent replacement for the Windows Task Manager) to investigate my current CPU usage and running processes. The screenshot above doesn’t show it because I didn’t take a screenshot at the time, but what drew my attention to a possible attack was multiple sshd.exe processes appearing and then disappearing (highlighted in red to indicate a process marked for destruction). My immediate instinct was that somebody was making connections and attempting to guess a password!


2 – I then instinctively (i.e., immediately and as fast as I could) opened a command prompt and typed the command netstat -a, which shows the list of active TCP connections. Sure enough, there were a number of connections to static-217-133-194-98.clienti.tiscali.it.
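If you want to keep an eye on this, a few lines of script can repeat the same check. Here is a minimal sketch in TypeScript (Node), shelling out to the same netstat command; the port-22/ESTABLISHED filter is my assumption about what is interesting here:

```ts
// Sketch: list established connections involving port 22 (SSH),
// i.e., a filtered "netstat -an". Assumes netstat is on the PATH.
import { execSync } from "child_process";

const output = execSync("netstat -an", { encoding: "utf8" });
for (const line of output.split("\n")) {
  if (line.includes(":22") && line.includes("ESTABLISHED")) {
    console.log(line.trim());
  }
}
```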


3 – Next I decided to see if the event viewer had recorded any activity. Wow! Over 30,000 events relating to sshd activity. The screenshot above shows the very first event recording a break-in attempt. On the evening of November 25, I switched my hardware firewall to redirect all port 22 SSH requests to my new computer. The next morning at 11:55:19 AM, the first attack commenced and proceeded to send a new username/password login attempt every 8 seconds for just under an hour and a half, ending at 1:19:19 PM. The attack sequence generated 2489 entries in the event viewer. You can see that the entry records a failed password guess for the non-existent user root. The attacking computer then tried a different password before switching to a new user account, ftp. Again, this is a non-existent account. The attacker tried a second password with this account before switching to yet another: sales.
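With tens of thousands of events, it is easier to let a script summarize them. Here is a hypothetical sketch in TypeScript that tallies failed password guesses per attempted account name; the sshd.log file name and the OpenSSH-style “Failed password for …” line format are assumptions about how the events would be exported:

```ts
// Sketch: count failed sshd password guesses per attempted user name.
import { readFileSync } from "fs";

const counts = new Map<string, number>();
const failedLine = /Failed password for (?:invalid user )?(\S+)/;

for (const line of readFileSync("sshd.log", "utf8").split("\n")) {
  const match = failedLine.exec(line);
  if (match) counts.set(match[1], (counts.get(match[1]) ?? 0) + 1);
}

// Most-guessed accounts first (root, ftp, sales, ...).
for (const [user, n] of [...counts.entries()].sort((a, b) => b[1] - a[1])) {
  console.log(`${user}: ${n} attempts`);
}
```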


Final exams

I am in the process of giving my Intro to Computer Science exam right now, with two more exams to go after this one. I thought I would take a minute to post an update on a number of projects in the works:

  • Ajax Performance Toolkit – I am in the final stages of getting ready to release this to web developers under the GNU General Public License (GPL). This “plug-and-play-and-configure” software allows a web developer to insert a small segment of code into any web page to monitor the performance of the Ajax requests being generated, the responses being received from the web server, and the current load on the server (a minimal sketch of the underlying idea appears after this list). Click on the screenshot below to see a larger image showing the toolkit applied to a page that retrieves the elevation under the cursor by sending an Ajax request to the server every time the mouse moves.
  • Overclocked Q9550 processor – back up to 3.78 GHz at 1.38 V core voltage. I invested the money in a nice processor and a nice motherboard, so why not use their full potential? A color-coded shaded-relief map of the entire state of Colorado can be generated over 20% faster with the overclocked processor than with the stock setup. Here are the updated PC Mark Vantage results.
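Here is the sketch promised above: the core idea behind monitoring Ajax performance, reduced to a few lines of TypeScript. This is illustrative code, not the toolkit’s actual implementation; it patches XMLHttpRequest so that every request on the page logs its round-trip time.

```ts
// Sketch: time every Ajax request on a page by wrapping XMLHttpRequest.
type TimedXHR = XMLHttpRequest & { _url?: string; _started?: number };

const origOpen = XMLHttpRequest.prototype.open;
const origSend = XMLHttpRequest.prototype.send;

XMLHttpRequest.prototype.open = function (method: string, url: string) {
  (this as TimedXHR)._url = url; // remember the target for the report
  return origOpen.apply(this, arguments as any);
};

XMLHttpRequest.prototype.send = function (body?: any) {
  const self = this as TimedXHR;
  self._started = Date.now();
  self.addEventListener("readystatechange", () => {
    if (self.readyState === 4 && self._started !== undefined) {
      console.log(`Ajax ${self._url}: ${Date.now() - self._started} ms (status ${self.status})`);
    }
  });
  return origSend.call(this, body);
};
```

A real toolkit would report these timings to a dashboard (and sample server load) rather than writing to the console, but the interception point is the same.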

Overclocking update

One thing I have learned over the last week is that overclocking is challenging and somewhat addictive. I thought it would be a simple tweak of the BIOS settings, but soon you find yourself adjusting them to squeeze every last MHz out of your processor at the lowest possible voltage, to avoid frying the motherboard and/or processor. “Frying” is a rather technical term referring to one of two things:

  1. Overheating – raising the processor core temp above its thermal specification rating for extended periods of time
  2. Overvolting – operating the processor at a voltage exceeding the maximum end of the VID Voltage Range

The other potentially bad thing that can happen when overclocking is that your processor can stop operating correctly while still being able to boot and load an entire operating system. This can trick you into thinking that your overclocked setup is working correctly when it really isn’t. A typical CPU executes literally billions of instructions per second, so if one or two or even twenty of those instructions execute incorrectly, the result may not even be noticeable to a typical operating system or the applications you run.

My architecture students should be able to tell you what happens if the clock rate is too high: signals do not have enough time to propagate to their final destinations, so an incorrect value gets written into memory or into a register before the newly calculated correct value arrives there.
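In textbook terms, the clock period must cover the slowest register-to-register path, which puts a hard ceiling on the clock rate (the standard single-cycle timing constraint, stated here from memory):

```latex
T_{\mathrm{clk}} \;\ge\; t_{\mathrm{clk\text{-}to\text{-}q}} + t_{\mathrm{logic}} + t_{\mathrm{setup}}
\qquad\Longrightarrow\qquad
f_{\mathrm{max}} = \frac{1}{t_{\mathrm{clk\text{-}to\text{-}q}} + t_{\mathrm{logic}} + t_{\mathrm{setup}}}
```

Overclocking shrinks the clock period while the propagation delays stay put, so the timing margin vanishes first on the longest paths.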

That is why overclockers know that you can’t trust your setup until you run a few good benchmarks and at least one torture test. Two common torture tests that you can download and run for free are Prime95 and Intel Burn Test.

I downloaded and installed Prime95 and started to run it on what I thought was a stable 3.6 GHz overclock. It lasted less than a minute before crashing with a hardware failure. The way Prime95 works is to perform a series of calculations and compare the results with known correct answers that are distributed with the program. If a calculated answer doesn’t match the known correct answer, the only explanation is faulty hardware. Also, my CPU core temperature started to approach the Intel thermal specification of 71°C very quickly after I raised the core voltage to increase stability. This is why overclockers need elaborate cooling setups!
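The self-checking idea is simple enough to sketch. Prime95 ships the known-correct answers with the program; in this toy TypeScript version the reference is just computed once up front, which illustrates the principle (repeat a deterministic computation forever and flag any mismatch) without Prime95’s actual math:

```ts
// Toy torture test: redo the same deterministic floating-point work and
// compare each result against a reference. On stable hardware, IEEE
// arithmetic is bit-for-bit repeatable, so any mismatch means the CPU erred.
function workUnit(): number {
  let sum = 0;
  for (let k = 1; k <= 5_000_000; k++) sum += 1 / (k * k); // converges toward pi^2 / 6
  return sum;
}

const reference = workUnit(); // assumed correct: computed before the stress begins
let passes = 0;

while (true) {
  if (workUnit() !== reference) {
    console.error(`Hardware error detected after ${passes} clean passes!`);
    break;
  }
  passes += 1;
  if (passes % 100 === 0) console.log(`${passes} passes OK`);
}
```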

So I resolved to scale my overclock back to 3.4 GHz with an 800 MHz memory speed, which is still a 20% overclock from the stock 2.83 GHz setting. With a core voltage setting of 1.26 V, Prime95 was able to run for 1 hour before I stopped it, with temperatures just reaching 65°C. I am using a cheap aftermarket air cooler, so I am hesitant to let it run longer without manually keeping an eye on the temperature and shutting it down if it climbs above 71°C.

Prime95 run – just over 5.5 hours with a CPU temperature warning
As you can see in the screenshot, Prime95 made it just over 5.5 hours without error after I left it running overnight. The ASUS PC Probe II software popped up the temperature warning. By stealing CPU cycles from Prime95 (and idling the CPU during the ensuing context switches), the temperature warning also acts as a very minor temperature governor.

ACM Mid-Southeast Conference Today!

After a late start, we made it into Gatlinburg, TN just after 1 AM. We got to drive through lightly falling snow and see all the beautiful Christmas lights, with hardly any traffic on the roads at all. Now it is on to the very serious business of wrapping up the presentation I am due to give in exactly 12 hours. I am close, but not quite finished with my primary experiment. I should have it wrapped up soon and will update this post with a copy of the PowerPoint once it’s ready.

Update – the presentation went well! Here are the PowerPoint slides:


Investigating the impact of Ajax on server load in a Web 2.0 application – PowerPoint (3.1 MB)

New (to me) research bibliography tool

I have been a long-time user of bibliography search engines such as CiteSeer, the ACM Portal, and IEEE Xplore, but I have always turned to standalone applications such as EndNote and JabRef to manage the articles I find. This summer I created my own web-based annotated bibliography tool and just barely got it off the ground before the need for it subsided. Tonight, as I organize my references for an upcoming conference presentation, I discovered a tool that surpasses the one I created and adds a social networking element. The website is called CiteULike (screenshot below) –
CiteULike social networking for academics

As far as I can tell, the only ways to group entries in your library are to create a special “group” that any registered member can join and post to, or to tag each entry you want in a category with a special keyword. I have opted for the latter approach, as I only want articles that I post to be in my reference library. The screenshot shows a post that I just made. The site seems pretty responsive, and there is a bookmarklet you can add to your browser toolbar to quickly create a new bibliography entry from the page you are currently viewing.

Overclocking adventure

I have always been conservative when it comes to the stability of my computer systems. My reasoning is that I have invested too much money in the equipment to ruin it by being performance-greedy. Today, though, I read an interesting article on Tom’s Hardware that convinced me to give overclocking a try.

Motivation

I recently upgraded my primary development machine. This new Intel-based system will serve double duty as my production server in the near future. During the current development stage of my e-commerce site, I would like to overclock the server and see how stable it is with a relatively conservative overclock. The current limiting factor for my target application is a CPU-intensive program called Global Mapper, which spends the majority of its time maxing out the CPU after an initial hit on the hard drive to load data into RAM. This is followed by an eerie silence with zero hard drive activity and a 100% CPU load for minutes to hours (for really large maps) while the CPU calculates how best to render the mapping data. Increasing the CPU clock rate directly affects the rate at which maps can be produced, so overclocking my processor by 20% should net close to a 20% performance boost in this particular application.
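That intuition is just Amdahl’s law with the CPU-bound fraction p close to one. If only the CPU-bound portion of the run speeds up by a factor s:

```latex
\text{speedup} = \frac{1}{(1 - p) + p/s}
\;\xrightarrow{\,p \to 1\,}\; s
\qquad\text{so with } s = 1.2,\ \text{speedup} \approx 1.2
```

The more time a workload spends waiting on the disk instead of the CPU (smaller p), the less a clock bump helps, which is exactly what the benchmark results below show.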

The system specifications

  • Intel Core 2 Quad Q9550 processor, 2.83 GHz, 12 MB cache
  • ASUS P5Q SE/R motherboard (Intel P45 chipset with ICH10R southbridge providing RAID)
  • 8 GB G.Skill PC2-8500 DDR2 RAM operating at 1066 MHz
  • Two 500 GB Seagate Barracuda 7200 RPM hard drives (32 MB cache) operating in RAID 0
  • 500 W energy-efficient Antec EarthWatts power supply

The overclocking experience

I began with Internet research to see how other people had fared overclocking similar systems. I found this article on Tom’s Hardware particularly helpful: “How would I overclock q9550 to 3.4ghz”. I decided I would first try raising the FSB from its default of 333 MHz to 400 MHz and leave all other settings on auto to let the motherboard adjust as needed. Well, I managed to boot into Vista, but as soon as I loaded up Global Mapper and tried to import a large dataset, I had a complete system lock-up. I suspect the motherboard did not raise the CPU core voltage sufficiently to compensate for the faster clock rate.
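For reference, the core clock is just the CPU multiplier times the FSB clock. The Q9550’s multiplier is locked at 8.5, so the two FSB settings work out to:

```latex
8.5 \times 333\,\mathrm{MHz} \approx 2.83\,\mathrm{GHz}
\qquad\longrightarrow\qquad
8.5 \times 400\,\mathrm{MHz} = 3.4\,\mathrm{GHz}
```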

After a reboot and a BIOS update (required to interact with the latest version of ASUS AI Suite, which monitors various CPU and motherboard sensors and can change BIOS settings dynamically without a reboot), I had the system up and running with the following manual settings entered into the BIOS:

  • CPU vcore: 1.28125 V
  • FSB: 400 MHz
  • Memory speed: 400 MHz (800 MT/s effective, because DDR2 transfers data on both edges of the clock)
  • Memory timings: 4-4-4-12
  • Memory voltage: 1.8 V (matches the G.Skill memory’s rated specs)

The results

The screenshots below tell the story: overclocked at 3.4 GHz, able to successfully complete the PC Mark Vantage x64 test suite, with memory running in dual-channel mode at an effective 800 MHz. So far the system has been up and running for a few hours with no more lock-ups. I plan to run Prime95 tonight to see if it can make it all the way through. Since a number of people have reported overclocking this setup to 4 GHz and beyond, I am not too worried!

CPU-Z screenshots – View the online validator results.


Modest improvement of 7.5% over the non-overclocked setup (not shown; PC Mark result = 6934)

The reason for the limited 7.5% improvement, even though my clock speed went up by 20%, is that PC Mark Vantage stresses hard drive access times, for which I have only a mediocre setup (even though transfer rates from the drives to memory are excellent). A more telling figure is the time required to generate the exact same map in the overclocked setup versus the original non-overclocked setup. Generating the color-coded shaded-relief map for the entire state of Colorado took 15.9 minutes in the original setup. With the overclocked setup, the exact same map took only 13.5 minutes, almost 18% faster!
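Checking that arithmetic:

```latex
\frac{15.9\ \text{min}}{13.5\ \text{min}} \approx 1.178
\quad\Longrightarrow\quad
\text{about } 17.8\%\ \text{more maps per unit time}
```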

Computer upgrade: ATI HD 4670 graphics card

The last component of my computer upgrade arrived today in a surprisingly small box. I purchased an ATI HD 4670 graphics card based on the Tom’s Hardware recommendation for entry-level graphics cards. I didn’t have a lot of money to spend on this upgrade, yet I wanted a card that would let me harness some parallel processing power via GPGPU programming. The HD 4670 has 320 stream processors, which I hope to experiment with to increase the map rendering speed for topocreator.com.

After installing the card, I updated my Windows Vista Experience Index, and it went from 3.3 up to a whopping 5.8 (5.9 is the maximum possible score). Note that my primary hard disk – two 500 GB Seagate drives in a RAID 0 configuration – is rated at 5.9. This is significant for my PC Mark Vantage experiment further down in this post.


Windows Vista Experience Index – 5.9 is the highest score possible

Next, I re-ran PC Mark Vantage, hoping for a figure somewhere in the 8000s. Instead, I saw an increase from 6144 to 6655 – much lower than expected. Further investigation revealed that tests involving my HDD (hard disk drive) performance had very low scores. I discovered that Windows Search was indexing my drive in the background, so I disabled Windows Search and re-ran the benchmark. This produced a further increase to 6934 with slightly improved HDD performance. See the screenshots below:
