A few days ago, I installed the 64-bit edition of SUSE 10.0 on my Compaq Presario R3000z laptop. After backing up all my data to my new Seagate external hard drive (it's 200 GB, so there is plenty of space), I resized the Linux (ReiserFS) partition from 15 GB to 25 GB and added an extra 5 GB FAT32 partition to share files between Linux and Windows XP (in case I ever use Windows :-).
The standard SUSE install does not come with NVIDIA's binary-only accelerated driver, so graphics were unaccelerated. I downloaded the 64-bit driver from nvidia.com but had problems installing it. Somehow, pulling the 64-bit driver through YaST worked. I still don't know what the original problem was, but anyway it works OK now, so I could change the resolution to 1680x1050 and the graphics are fine.
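For anyone in the same spot, the usual SUSE-era way to switch X over to the NVIDIA driver once it is installed was roughly this (a sketch from memory; run as root, and check the SaX2 documentation for your version):

```sh
# Reconfigure X.org to use the binary nvidia driver for card 0
# (SaX2 is SUSE's X configuration tool); restart X afterwards.
sax2 -r -m 0=nvidia
```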
Another problem was the wireless device. 64-bit Linux does not accept 32-bit drivers, so the old Broadcom driver for Windows XP did not work. After some googling, I found a driver for 64-bit Windows (somebody posted it at www.planetamd64.com; you must register with the site to download it, but registration is free). Yet ndiswrapper still could not load the driver. In the dmesg output, I saw an error saying "ntoskernel.exe: RtlZeroMemory" was an undefined function. Hmm...
One problem was that the *.conf files under /etc/ndiswrapper/bcmwl5 carried a different devid than my card; lspci -n showed another devid. I created a symbolic link to the existing conf file, with the correct devid as the link's name. Still, ndiswrapper failed to load the driver. Oh, I forgot to mention that I always set the environment variable CFLAGS="-mtune=athlon64 -msse2 -mfpmath=sse -ffast-math -m64", as well as CPPFLAGS.
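If you hit the same devid mismatch, the workaround looked roughly like this (a sketch; the PCI IDs and conf file names below are illustrative examples, so substitute whatever lspci and ls actually report on your machine):

```sh
# Find the wireless card's vendor:device ID (14e4 is Broadcom's vendor ID;
# the device ID that lspci -n prints is what ndiswrapper must match)
lspci -n | grep 14e4

# See what conf files ndiswrapper generated from the Windows .inf
ls /etc/ndiswrapper/bcmwl5/

# Symlink the existing conf to a name carrying the devid lspci reported
# (both IDs below are hypothetical)
cd /etc/ndiswrapper/bcmwl5
ln -s 14E4:4309.5.conf 14E4:4318.5.conf

# Reload the module and check the kernel log for errors
modprobe -r ndiswrapper; modprobe ndiswrapper
dmesg | tail
```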
I googled again and found a post describing the same problem. The poster later said that he upgraded ndiswrapper to the latest version and it worked. I gave it a shot, and you know what? It worked!
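For completeness, upgrading ndiswrapper from a source tarball and re-installing the Windows driver went roughly like this (a sketch; the version number and .inf path are placeholders):

```sh
# Build and install a newer ndiswrapper (run make install as root)
tar xzf ndiswrapper-1.5.tar.gz
cd ndiswrapper-1.5
make && make install

# Re-install the 64-bit Windows driver and verify it is recognized;
# "hardware present" in the -l output is what you want to see
ndiswrapper -i /path/to/bcmwl5.inf
ndiswrapper -l

# Load the module
modprobe ndiswrapper
```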
The next tasks for me are to recompile xine (the build that ships with SUSE Linux cannot decrypt CSS-scrambled commercial DVD movies) and to rebuild some other packages as 64-bit.
Thanks to Google, I have successfully made my laptop's wireless work! I am falling more in love with Linux now. I rarely boot my laptop into Windows anymore, only when I need to run "Age of Empires".
Sunday, December 18, 2005
Friday, December 2, 2005
It is now the Era of Parallel Computing!
Everything is now done in parallel! AMD and Intel are racing to be first to market with their dual-core, quad-core, or even octa-core processors, although I believe AMD won the first round on both 64-bit and dual-core. The Xbox 360, released a few weeks ago (and still hard to find in stores!), also uses parallel processing. Sony's PS3 will go even farther by using 8 cores [well, they may not really be 8 cores in the conventional-processor sense. See http://www.ps3land.com/ps3specs.php].
The hardware has arrived! Now it is the software developers' turn to decide whether to use this parallel computing power. That is the biggest question now, especially in the PC world. Many developers, particularly those building business software, still think parallel processing is overkill for most software used in offices and homes. For editing documents, browsing, or reading and sending email, we do not need parallel computation. That may be true. But for power home users who edit video, transcode media files (e.g., WAV to MP3, MP3 to OGG, or DVD to DivX/MPEG-4), or do 3D-related processing (games, vector-based applications, CAD, raytracing, etc.), parallel processing is a big deal.
Programming parallel computations is not an easy task; it is complex and mind-squeezing. First, we have to decompose a serial process into "grains" (the smallest parts of the computation that cannot be decomposed any further) and maximize the independence between grains. Then these grains have to be distributed across the processing units. We also need to account for communication and processing time on each unit, especially if we want to load-balance the work. To make a long story short, it is not an easy job at all for software developers, as the sketch below illustrates.
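As a concrete, coarse-grained example in the spirit of the transcoding workloads mentioned above, here is a minimal sketch (assuming the lame MP3 encoder and GNU xargs are installed) in which each file is an independent grain and xargs spreads the grains over the cores:

```sh
#!/bin/sh
# Each WAV file is one "grain": no grain depends on another, so there is
# no inter-grain communication to worry about. -P 2 keeps two encoder
# processes running at once (one per core on a dual-core CPU); xargs hands
# the next grain to whichever worker finishes first, which gives simple
# dynamic load balancing even when the files differ in length.
# (Output files end up named like track.wav.mp3.)
printf '%s\0' *.wav | xargs -0 -P 2 -I {} lame --quiet {} {}.mp3
```

This is the easy, "embarrassingly parallel" case; fine-grained parallelism with shared data and synchronization is far harder, which is exactly the point above.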
A few weeks ago I read in some magazine (I forget which; maybe eWeek or InformationWeek) that Microsoft is now turning its eyes toward parallel computation. At one conference, Bill Gates told the forum about it. This is a totally new world for Microsoft, as the field is mostly dominated by UNIX-based operating systems (including Linux). Even the fastest supercomputer, IBM's Blue Gene, runs a custom-tailored Linux OS.
If you are a software developer, learn parallel programming now and grab this new job opportunity! Google, Microsoft, Intel, AMD, Sun, IBM, nanotechnology companies, research labs, biotech/pharmaceutical companies, game studios, and many others are looking for talent to start coding for their parallel-processing systems. Remember Folding@home and SETI@home? Those are a taste of the parallel-programming tasks you will need to master.
64-bit or 32-bit?
Novell's SUSE has released the latest version, 10.0, of its Linux distribution (and 10.1 is under development). At opensuse.org, I saw there are also ISO files available for 64-bit processors. A few weeks ago I downloaded the 5 installation-CD ISOs from opensuse.org (and just recently converted them into a single DVD; just follow the instructions at http://www.opensuse.org/Making_a_DVD_from_CDs).
I booted my AMD64 laptop from the DVD, but then I changed my mind and decided not to upgrade my Linux yet, staying with the older version (SUSE 9.3). One of the reasons is that my Linux partition is too small for the 64-bit version (I have only 12 GB of my 80 GB hard drive for Linux; the rest is for WinXP). I searched the Internet and found discussions saying the 64-bit version requires almost twice as much space as the 32-bit version. This is because there are /lib64 and /usr/lib64 for 64-bit libraries, in addition to /lib and /usr/lib, so applications of both flavors can run side by side. Also, 64-bit binaries are generally larger than 32-bit ones, because some of the processor's instructions require extra bytes to handle 64-bit operations.
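You can see this biarch layout for yourself on an installed 64-bit system (a quick sketch; the paths are the standard SUSE ones, and the sizes will of course vary per installation):

```sh
# Compare the sizes of the 32-bit and 64-bit library trees
du -sh /lib /lib64 /usr/lib /usr/lib64

# Check which flavor a given binary is
file /bin/ls    # e.g. "ELF 64-bit LSB executable, AMD x86-64 ..."
```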
Some time ago, I read on the Internet that ndiswrapper might not work in a 64-bit environment. But I saw there was a 64-bit version of it on the CD, so I believe it is now supported. This was another reason I was afraid to upgrade at the beginning. Also, NVIDIA and ATI now ship 64-bit versions of their graphics drivers. The remaining issues are minor, I guess.
What are the benefits of a 64-bit system? First of all, it can handle a huge memory space (hundreds of terabytes, instead of just 4 GB on a 32-bit system). Another thing: theoretically it should be faster. Why, you might ask? Because the processor can transfer twice as much data in the same time as a 32-bit system. Security code also gets a big benefit, because 64-bit integer arithmetic, heavily used in cryptography, now maps directly to machine instructions. I recall that in a 32-bit environment, large-integer computation had to be emulated in software with multi-word algorithms.
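A few standard commands show whether a CPU and the installed system are actually running 64-bit (a small sketch using stock tools):

```sh
# Does the CPU advertise AMD64 "long mode"?
grep -qw lm /proc/cpuinfo && echo "CPU supports 64-bit (lm flag present)"

# Is the running userland 64-bit? Prints 64 on a 64-bit system, 32 otherwise.
getconf LONG_BIT

# Kernel architecture: x86_64 on a 64-bit kernel
uname -m
```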
Anyway, this is just my hypothesis; I have not tested or benchmarked it yet. I will do so soon, but first I need to get my 200 GB Seagate external hard drive to back up all my data. I will post the results here as soon as I am done.
Saturday, October 15, 2005
Big Databases
InformationWeek last month had a column on which companies have the biggest databases on their servers. The following are the top ten commercial databases, ranked by size:
- Yahoo, 100.4 TB. Platform: Unix
- AT&T Lab, 93.9 TB. Platform: Unix
- KT IT-Group, 49.4 TB. Platform: Unix
- AT&T Lab, 26.7 TB. Platform: Unix
- LGR-Cingular, 25.2 TB. Platform: Unix
- Amazon.com, 24.8 TB. Platform: Linux
- Anonymous, 19.7 TB. Platform: Unix
- Unisys, 19.5 TB. Platform: Windows
- Amazon.com, 18.6 TB. Platform: Linux
- Nielsen Media, 17.7 TB. Platform: Unix
Or maybe it has fewer than 20 million users in the US? Likely.
With the commodity hard disks currently on the market, a 400 GB drive can be bought for around $300 (I just checked shopping.com; it actually costs less than that), so with three drives we could get 1.2 TB of storage for only $900. Getting to 120 TB would cost only $90,000, as the quick calculation below shows.
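Spelled out, the back-of-envelope math (assuming the $300-per-400-GB price point above) is:

```sh
# 120 TB / 400 GB = 300 drives; 300 drives x $300/drive = $90,000
echo $(( 120000 / 400 * 300 ))    # prints 90000 (dollars)
```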
For the full details, see www.wintercorp.com.
Thursday, October 13, 2005
Free Wi-Fi Access
Google has started beta testing its free Wi-Fi service in a few San Francisco locations. I don't know what the service looks like to users yet, but the big concern is how secure it is, especially when people access sensitive data (e-commerce transactions, etc.).
Anyway, this is good news, and I have to say Google is one of the coolest companies to work for. Kudos to Google for this initiative!
Friday, September 23, 2005
Firmware for the WRT54G
OK, now I am going to tell you about a few firmware images that crashed my Linksys router (luckily, I revived it, thanks to the instructions posted at http://voidmain.is-a-geek.net/redhat/wrt54g_revival.html).
The firmware that works on my router is Tarifa 0003. Unfortunately, I have never been able to make it act as a wireless bridge (WDS), even though I followed instructions from the Internet (I believe from www.linksysinfo.org).
Anyway, after a failed attempt to flash the NVRAM with another firmware, I was able to recover my router, which had almost become a brick. :-)