ubuntu – Terminally Incoherent
http://www.terminally-incoherent.com/blog
I will not fix your computer.

Unity is not Great
http://www.terminally-incoherent.com/blog/2013/05/08/unity-is-not-great/ | Wed, 08 May 2013

About two weeks ago my work laptop died. The motherboard just bricked itself and there was no rescuing it. As the machine was old and decrepit, and I was going to replace it with something with an entirely different hard drive profile, I opted for a clean install of Ubuntu 12.04. Previously I had been running Kubuntu 10.04, and it was high time to move on.

Why not 13.04, you ask? Well, I’m in my 30s now, which means I’m living on an accelerated time-space curve. Time simply flows faster than it did when I was in my teens or twenties. For example, I recently referred to an event that happened back in 2010 as “the other day”, because that’s honestly how it felt to me. From that you can probably extrapolate that I am not a huge fan of upgrading my OS every other day (which is roughly how 6 months feels to me). I can hardly give half of a fuck which quirky adjective-animal combination is the most up to date. I just want something semi-stable that is regularly patched and has a decent-sized package library. I’ve run non-LTS versions of the OS before, and I was always unpleasantly surprised when Canonical end-of-lifed them and “vanished” the repositories, leaving me stranded every 6 months or so. To put it plainly, anything that is not an LTS is not worth my time.

12.04 ships with Unity – Canonical’s new window manager and desktop environment. There was a lot of discussion about it when it was first released, and the Ubuntu community seems to have been torn between those who like it and those who hate it. Previously I didn’t really have an opinion in the debate, because I hadn’t used it for anything substantial. I checked it out back when it was new, of course, and it looked kinda sleek and a tad showy for my tastes, but I didn’t outright hate it.

With my switch to 12.04 I decided to give it a fair shake and see how it performs in regular day-to-day usage. Unfortunately, it turns out that all the haters were right: Unity is not that great. It is style over substance. It tries so hard to be OSX-like that it makes things that used to be simple needlessly difficult.

Here are my biggest complaints about Unity in order of severity:

  1. Poor performance
  2. No functional pager
  3. Terrible application awareness

Let me tackle each of these in turn.

Performance

The machine I replaced my old laptop with wasn’t top of the line. It was one of the rank-and-file laptops we had in stock, and I might be able to swap it for a beefier dev machine at some point soon. Still, the hardware was nothing to scoff at – a respectable Intel i5 CPU and 4GB of RAM. Nothing to write home about, but we usually run Windows 7 with full-blown Aero effects on this hardware, and it handles it without breaking a sweat. Unity really made it sweat: the fan was whining at full speed most of the time, and launching applications would actually freeze up the desktop for a few seconds. Switching desktops was painful.

You could argue that it’s not Unity but Ubuntu itself being an unoptimized resource hog. I was concerned about this too, so I decided to run an experiment with the following command:

sudo aptitude install gnome-panel

I then logged out, logged back in, and all my performance issues went away. Applications were now launching normally, and switching desktops went from a one-second lag followed by a jerky animation to an instantaneous flick.

It’s probably also worth mentioning that after trying Gnome Panel, I went back and booted into Unity 2D mode to see how it stacks up. It was a big improvement over the default setup, and the machine was actually usable. So if you do want to give Unity a whirl, I highly recommend the 2D mode unless you have a top-of-the-line rig. Of course, half of Unity’s charm is how pretty it looks, so you will be getting a diluted experience. Between this and my other issues, I decided it just wasn’t worth it.

Lack of Pager

Both KDE and Gnome have always had excellent pagers. You know – those little widgets that sit in your task bar and let you switch between your virtual desktops. I always loved the fact that I could just glance at said pager in the corner of my screen and see the rough layout of my windows on each desktop. Not only that, but I could click on any arbitrary desktop to switch to it, or grab any of said windows and drag it to a different desktop, letting me organize my shit without any hassle. In Unity that functionality is replaced with an OSX-inspired “workspace switcher” which does a pan-and-zoom-out kind of animation every time you activate it. Its icon in the dock is static and doesn’t give you that at-a-glance preview, which is a downgrade.

Gnome Panel

Note the pager in the bottom right, which shows you what windows are open on which desktop/display and allows you to drag and drop them.

I liked seeing window outlines in my KDE/Gnome pager. I liked being able to drag windows between desktops via the pager, without actually switching to those desktops. The pager has always been one of the best features of the Linux desktop, and I miss it whenever I’m working on Windows or OSX. In the past I have tried third-party solutions that add virtual desktops to Windows and pager-like functionality to OSX, but none of them ever worked as well as the native KDE/Gnome task-bar widgets.

One could argue that removing the pager is not unorthodox, because it mimics OSX behavior that users might be more familiar with. The fact that it was aped from Apple, however, doesn’t make it good. Personally I am not a huge fan of OSX Spaces. I’m glad the functionality exists, but I often find myself wishing it worked more like traditional Gnome/KDE virtual desktops. It is admirable that Canonical is trying to learn from the best, but I think in this case they got it wrong. They took something that was working well and replaced it with something similar but less functional.

Application Awareness

Unity is also missing a task bar. I don’t know about you, but I like task bars. I’m very fond of traditional, one-entry-per-window task management. One of the first things I do when I install Windows is disable window grouping in the task bar. I like to be able to see how many windows I have open, and to switch between them easily with the mouse. Collapsing applications into a single task bar icon hides vital information that I need to work efficiently.

OSX doesn’t have a traditional task bar, but it provides some alternatives. When you minimize windows in OSX, you essentially iconify them into their own dock entries. So as long as you Command-M your windows instead of just letting them fall out of focus, you have yourself a functional task bar with one entry per minimized window.

Unity’s implementation combines the worst features of both worlds and ends up with a Launcher that is barely workable. Instead of showing you what windows are open on the current desktop, Unity adds dots to the left of each app’s icon – one dot per open window on the current desktop… but only up to three dots in total, for some reason. Instead of collapsing minimized windows into their own entries, it simply hides them. So the only way to find out whether you have minimized windows on the current desktop is to count the dots, count the open windows, and subtract.

Unity Desktop

Pop Quiz: How many instances of Terminal are running here? The answer is five.

You can of course trigger an Apple-Exposé-like window splay by clicking on said dots, but that shows you a preview of all the windows without differentiating which were open, which were in focus, and which were minimized. It is a very haphazard, unfocused take on window management. With a traditional task bar I can glance at the bottom of my screen and instantly know how many windowed applications I’m running and on which desktops I have open windows. With Unity I always felt like I was flying blind, never having full awareness of my work environment.

Conclusion

In my honest opinion, Unity is mostly broken by design. I don’t blame Canonical for trying to design a desktop environment that is easy to use and intuitive. I don’t blame them for abstracting and hiding away a lot of configuration details in favor of a streamlined and unified look and feel. I don’t mind that they came up with an opinionated desktop environment that makes bold design choices. This is actually a good thing. The Linux desktop needs this sort of focus on usability and user-friendly environments. I’m glad that Unity exists, because it allows us to have discussions about usability and user-centric design on the Linux desktop – something that was an almost foreign concept a few years ago.

That said, when you come up with an opinionated framework that makes bold decisions, you risk losing established power users. I am unfortunately one of them. I’ve been an Ubuntu user for many years now, but Unity is just not for me. I like traditional task bars and pagers, and I want them to be part of my desktop experience. If I didn’t like them I would probably be using a tiling window manager like real men are supposed to (or so I’m told).

That said, I can see how Unity could provide a better user experience for novice users who have not yet developed habits such as juggling many virtual desktops and displays. If you are the type of person who usually runs one or two applications at a time, Unity 2D might actually be a viable option. The large icons on the launcher and the search-based application finder make it very easy for a Linux novice to find the program they need for a particular task.

It’s a pity that Unity offers next to nothing to us power users.

Ubuntu Disk Cleanup
http://www.terminally-incoherent.com/blog/2012/11/10/ubuntu-disk-cleanup/ | Sat, 10 Nov 2012

This fine morning KDE greeted me with a particularly nasty warning:

Do you think I’m low on space?

Seems like it is time for some spring cleanup… And by spring, I of course mean winter. But where to start?

Well, the best place to start is usually nuking your temp files. There are many ways to do this, but my go-to tool is BleachBit. It is a multi-platform tool that does a remarkably decent job sweeping up the garbage that crufts up Linux filesystems, without doing any major damage. Also, it is quite fast.

Unfortunately, in my case it only managed to free up about 300MB of space. That certainly beats the 89MB of free space I had before, but it’s still not great.

Here is a question: what does it mean when deleting temp files makes almost no difference to the free space on your hard drive? It means that all the space was eaten up by your own activity – files you downloaded, software you installed, and so on. So the first thing to do is clean up your home directory.

If you are like me, you probably have one or more “dump” folders where you just shove random files you don’t feel like filing anywhere else. Mine are:

  • ~/Downloads/ which is the default download folder in Chrome
  • ~/tmp/ which is where I dump random files, logs, etc. For example, if I need to quickly photoshop a file and upload it to the internet for lulz, it goes into this directory.
  • ~/wkspc/ is a higher level temp dir where I put random tests and code snippets I don’t plan on saving

As a rule, it is usually safe for me to purge these directories outright when I’m running low on space. Whenever I find myself working on something inside ~/wkspc/ for more than a brief, fleeting instant, I move it to ~/projects/ and put it under source control. Everything else is fair game.

Sadly, nuking those folders gave me very meager results – probably because most of the garbage I download and generate on a day-to-day basis is rather small. So where was all my space tied up? I decided to find out using the du command:

du -sBM * | sort -nr

This will give you a nice list of folders ordered by size that looks a bit like this:

Find large folders using du

I actually took this screenshot after doing some cleanup, but you can sort of see how it works. The largest repositories of junk in my case were my Dropbox folder, which I can’t really do much about, and my Desktop folder, where I had a large directory of work-related crap I did not want to get rid of. The rest of the directories looked fairly clean. And yet running df would show that / was 96% full.

Then I got another idea – my previous search skipped all the dot-files, since the shell glob * does not match them. So why not check them specifically:

du -sBM .* | sort -nr

Can you say jackpot?

Wat r u doin, VirtualBox? Stahp!
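For future cleanups, both passes can be combined into one command. The `.[!.]*` glob is my own shorthand, not something from the original commands – it matches dot-entries while skipping `.` and `..`:

```shell
# Size up everything in the current directory, dot-files included,
# largest first. 2>/dev/null hides permission noise (and the glob
# itself, if no dot-files exist).
du -sBM -- * .[!.]* 2>/dev/null | sort -nr | head -n 15
```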

My VirtualBox directory had grown to over 50GB. Not good! I immediately sprang into action: deleted a bunch of old snapshots, dumped unused VDI files onto an external hard drive, and went to town compacting the current ones. How do you compact them, you ask? Using the friendly VBoxManage command:

VBoxManage modifyhd /path/to/your.vdi compact

Actually, if your VM runs a Windows OS, you should follow the advice listed here:

  1. Run the VM
  2. Defrag your disk once or twice
  3. Zero-out empty space with Sysinternals sdelete
  4. Then compact it using the command above
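As a sketch, here is the host-side half of that recipe. The VM name and path below are made-up examples – adjust them to whatever `VBoxManage list hdds` reports on your machine:

```shell
# See every disk image VirtualBox knows about, with UUIDs and paths
VBoxManage list hdds

# Inside the Windows guest (not on the host!), after defragmenting,
# zero out the free space with Sysinternals sdelete:
#   sdelete -z c:
# Then shut the guest down and compact the image on the host:
VBoxManage modifyhd "$HOME/VirtualBox VMs/win7/win7.vdi" compact
```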

For me, this yielded about 10-20% savings per VDI file, which was not insignificant. But since I was already in cleanup mode I decided to keep going.

Having deleted most things in ~ that I was willing to part with, I turned to software. I can be pretty bad about installing software: often I will download and install packages that I use once and never touch again. I’m especially fond of downloading web browsers, IDEs and scripting language runtimes just to see how they feel. Half of these things don’t need to be taking up space on my hard drive.

So I decided to generate a list of installed packages and order it by size:

dpkg-query -Wf '${Installed-Size}\t${Package}\n' | sort -rn | less

Biggest offenders?

Use Tmux to make it easier
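One gotcha: the number in the first column is the package’s Installed-Size, which dpkg reports in kilobytes. A small awk filter – my own addition, not part of the original command – makes the listing easier to read:

```shell
# Top ten packages by installed size, converted from the kilobytes
# that dpkg-query reports into megabytes.
dpkg-query -Wf '${Installed-Size}\t${Package}\n' 2>/dev/null |
  sort -rn |
  awk -F'\t' '{ printf "%8.1f MB  %s\n", $1/1024, $2 }' |
  head -n 10
```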

It seems that the biggest packages on my system were Haskell, Google Chrome, Open Office and… a shit-load of old kernel images. See, this is the kind of thing that happens on Linux when you just let the OS upgrade itself whenever it wants. Every time there is a kernel security patch or an update, apt leaves the old kernel version intact. This is good, because you can always revert to your old kernel if the new one breaks everything and ruins your day. But after a while you end up with a long, long list of defunct kernel images and header files. You can see the entire list like this:

dpkg -l 'linux-*'

How do you clean this up? Well, you aptitude purge all the kernel packages except the current one. You can check which kernel you are running right now via the uname -r command. Then sit down and prepare to type in a lot of package names…

Or use this one-liner to generate a list of all installed kernel packages except the current one:

dpkg -l 'linux-*' | sed '/^ii/!d;/'"$(uname -r | sed "s/\(.*\)-\([^0-9]\+\)/\1/")"'/d;s/^[^ ]* [^ ]* \([^ ]*\).*/\1/;/[0-9]/!d'

I can’t claim ownership of this one – this sed monstrosity was made by the folks at the Ubuntu Genius blog. In fact, they went one step further, showing how to automatically purge these things in a single command:

dpkg -l 'linux-*' | sed '/^ii/!d;/'"$(uname -r | sed "s/\(.*\)-\([^0-9]\+\)/\1/")"'/d;s/^[^ ]* [^ ]* \([^ ]*\).*/\1/;/[0-9]/!d' | xargs sudo apt-get -y purge

In my case, the uninstallation took close to an hour to complete, and reclaimed over 8GB of space, without damaging anything important.
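If piping a sed monstrosity straight into apt-get purge makes you nervous, you can dry-run the filter on canned dpkg -l style input first. The package names and the kernel version below are invented for the demo:

```shell
# Pretend the running kernel is 3.2.0-35-generic and feed the filter
# some fake dpkg -l lines. Only stale kernel packages should survive;
# the running kernel is excluded, and the /[0-9]/!d clause drops
# digit-free packages like linux-firmware.
current="3.2.0-35-generic"
printf 'ii  linux-image-3.2.0-23-generic  amd64\nii  linux-image-3.2.0-35-generic  amd64\nii  linux-headers-3.2.0-23  all\nii  linux-firmware  all\n' |
  sed '/^ii/!d;/'"$(echo "$current" | sed "s/\(.*\)-\([^0-9]\+\)/\1/")"'/d;s/^[^ ]* [^ ]* \([^ ]*\).*/\1/;/[0-9]/!d'
# Prints:
#   linux-image-3.2.0-23-generic
#   linux-headers-3.2.0-23
```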

For good measure I went back and uninstalled useless things like extra web browsers, various IDEs, language runtimes, and any GUI tools that hadn’t been touched since I installed them. All in all, I think this was a successful cleanup:

After the cleanup

My / is now only 81% full, with over 17GB of free space. What do you think? Do you have any disk cleanup tips you would like to share? Let me know in the comments.

Chrome and Java plugin on Ubuntu 10.4
http://www.terminally-incoherent.com/blog/2012/02/15/chrome-and-java-plugin-on-ubuntu-10-4/ | Wed, 15 Feb 2012

In the last week or so the Java plugin on my laptop completely broke. I’m not sure what exactly happened, but it just stopped working. At first Chrome started complaining that my Java was out of date, but I ignored it. Not because I felt like it, but because the latest and greatest release from Sun had not yet hit the 10.04 repositories. I was confident that it would get there eventually, so I just let it linger. Then it broke.

Why am I running 10.04 and not the latest release? Because it’s an LTS. It is still fully supported, it still gets security patches, and it lets me ignore the upgrade treadmill. It’s not that I don’t like shiny new features – I just happen to have a finicky Nvidia video card that relies on proprietary drivers which may or may not need some time investment to get working properly. Same goes for my sound card. Almost every time I do a system upgrade I am left with 800×600 resolution and no sound until I figure out what broke this time. So at some point in the past I wisely decided that this laptop is going to be an LTS-only machine.

I know, I know. Fuck Java. Fuck flash, fuck Java, fuck all them plugins with a broomstick! Let’s do everything in HTML5 and JavaScript. Honestly, I’m all for it. To me, Java is mostly a back-end technology, good for doing enterprisey stuff. It is not really that welcome on the client side, because most of the time a native solution is going to be much more responsive, while Java windowing toolkits tend to be treated as second-class citizens. You can do some cool client-side things with Java – like Minecraft, for example. But for the most part, it is best on the back end.

A broken Java installation should not be a huge problem, but it just so happens that I often use that machine to maintain our local Barracuda SSL VPN, which uses a two-factor authentication scheme. The login page actually uses an applet to search for a key file on your file system. Yes, an applet. I know – it’s an abomination unto all that is holy. But the SSL VPN does have some really useful functionality. If I must use Java to get it, then so be it. You can imagine that having a broken Java was a bit of an inconvenience.

In about:plugins I had it listed as libnpjp2.so with the correct path:

/usr/lib/jvm/java-6-sun-1.6.0.26/jre/lib/i386/libnpjp2.so

There was no usual description and no details associated with that entry. It looked strange and broken. I tried to uninstall and reinstall Sun’s Java package via apt, but that just was not working. So I finally broke down and decided to manually install Java 1.6.0.30.

To keep things simple I decided to dump it into /opt because that’s where I keep things like that. I have Chrome living in /opt/google/chrome/ so it made sense to put my non-deb Java in /opt/java/. First, let’s create that folder:

sudo mkdir /opt/java

Next, we download the self-extracting .bin file from the java.com download page, move it to our new directory, and extract it:

sudo cp jre1.6.0.30.bin /opt/java/
cd /opt/java/
sudo chmod +x jre1.6.0.30.bin
sudo ./jre1.6.0.30.bin
sudo rm jre1.6.0.30.bin

To ensure this new version is globally registered on the system, we open the Java control panel:

ControlPanel
  1. Click on the Java Tab
  2. Click on View
  3. Click on Find
  4. Click Next
  5. Navigate to /opt/java/jre1.6.0.30
  6. Click Next
  7. Click Finish
  8. Uncheck all previous versions
  9. Click Ok
  10. Click Apply

The list in #8 should look something like this:

Registered versions of Java
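If you would rather skip the GUI, the command-line half of this can be handled with update-alternatives – though note this registers the java binary itself, not the browser plugin, which still needs the symlink below. The priority value of 1 is arbitrary, and the path assumes the /opt/java layout from above:

```shell
# Register the manually installed JRE as a system alternative for 'java'
sudo update-alternatives --install /usr/bin/java java \
    /opt/java/jre1.6.0_30/bin/java 1

# Then pick the active version interactively
sudo update-alternatives --config java
```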

Next we need to link the plugin in the Chrome directory so that it can find it.

cd /opt/google/chrome

You should have a /opt/google/chrome/plugins folder here. For some reason, I didn’t. I’m not sure why or how that happened, since other plugins like Flash were working fine. I ended up having to create it:

sudo mkdir plugins
cd plugins
sudo ln -s /opt/java/jre1.6.0_30/lib/i386/libnpjp2.so .

After this, Java started working again like a champ. If it does not work for you, you might need to change your Chrome shortcut to run with the --enable-plugins parameter. I had already added that one a while ago, when I started getting the outdated plugin messages.

I’m putting this here mostly for my own reference, and for posterity. Perhaps it will help someone whose Java gets similarly broken. While these sorts of posts are probably not all that interesting to regulars, they do sometimes bring in Google traffic, and direct new potential readers to my blog.

I promise to post something more fun on Friday.

New Toy: Compaq Presario 1800
http://www.terminally-incoherent.com/blog/2010/06/08/new-toy-compaq-presario-1800/ | Tue, 08 Jun 2010

I have inherited yet another old laptop. This time it is a lovely Compaq Presario 1800. If you have never actually seen one of these, here is what it looks like:

Compaq Presario 1800

I have no clue what it is with my relatives and Compaq machines. I guess it just goes to show that Compaq was once as dominant a force on the market as Dell is today. Everyone seems to have an old Compaq or two somewhere in the attic or under their bed. It’s uncanny.

The Presario 1800 is actually a huge step up from the previous two junkers that I managed to run into the ground. If you remember correctly, my Presario 1240 had a rather interesting life running Damn Small Linux, Slax, Fluxbuntu, and then vanilla Ubuntu Hardy with Ratpoison. This laptop is way ahead in terms of specs:

  • Intel Pentium III 696 MHz
  • 128 MB RAM
  • ATI Rage Mobility M3
  • 20 GB HD
  • Intel Pro/100 S Mobile LAN

The only bad thing about this machine is that the battery is completely dead, so it will only power up when attached to AC power. Still, compared to the TI Extensa Scholar ESS2 I was messing around with not so long ago, this is like a race car. Btw, I eventually got a bare-bones, CLI-only Linux system running on the Extensa, but it crapped out on me before I managed to get the networking figured out.

Presario 1800 Closed Lid

So far I haven’t done much with this machine. When I got it, it was running Windows XP. I don’t know how, but it was running it, and the performance wasn’t actually that bad. It was sluggish, but workable. I decided to scrap it and exchange it for something leaner and more fitting. And then I picked the most bloated and sluggish distro ever: Xubuntu. I made that decision after seeing how responsive XP was on this system. I figured that if it could reliably run the bloated MS system, it should handle Xubuntu, no?

Well, it can, but it ain’t pretty. First off, Xubuntu would not even boot into the live environment on this machine. This should have been a red flag, but I didn’t pay it any attention. I used the alternate installer instead and managed to get the system running. The good news was that networking worked out of the box. The machine didn’t have an on-board wifi card, and I didn’t really need one, seeing how it needs to be tethered to a power outlet at all times. The bad news was that it ran almost unbearably slowly.

Xubuntu has a nice polished UI, but it is a power-hungry abomination of a system. With only 128MB of RAM, almost everything required swapping. Hell, just moving the mouse would usually produce grinding hard drive sounds. Launching Firefox was a 5-minute ordeal, during which the machine would almost shake from the frantic hard drive activity. It saddens me to say this, but if you have an older machine like this one, XP will actually perform better than Xubuntu. There is just nothing lean or low-end friendly about that distro – it is as bloated and resource-crazy as the vanilla Gnome version. It’s just a distro for people who like Ubuntu but hate both Gnome and KDE.

One thing that I like about this system is the touch pad area:

Presario 1800 Touchpad Area

It has a little built-in LCD display which was supposed to turn this machine into something like a CD player. It will actually display the track you are currently playing, and so on. I’m not sure if I could get it working under Ubuntu, but it would be neat if I could take it over and use it to display arbitrary messages. This will require some research.

Btw, the two blue buttons next to the mouse keys are my favorite thing in this whole laptop. They are essentially a scroll-wheel replacement, and they worked out of the box in Xubuntu. I was able to scroll web pages up and down with them, which was amazing. I always hate using touchpads on old laptops that do not implement the whole “drag your finger along the edge to scroll” feature. This is brilliant. All the other buttons, except for the volume controls, were useless of course.

My next step is probably going to be installing a command line only Ubuntu system with some lightweight UI on top. I might go with Ratpoison again. I’m not sure yet. Does anyone have suggestions? What distro should I try next?

Linux: how do I find the device name of my USB drive?
http://www.terminally-incoherent.com/blog/2009/12/28/linux-how-do-i-find-the-devce-name-of-my-usb-drive/ | Mon, 28 Dec 2009

Around the time I reviewed Chromium OS, I managed to totally b0rk one of my thumb drives. I somehow botched a dd command, and the device became unusable. When I plugged it in, nothing would happen. Or rather, nothing on the UI side: KDE would simply ignore the drive and pretend it was not there. I didn’t want to just throw out the USB stick, so I decided to figure out what device name it gets assigned, and then repartition it.

How do you do that? The simplest method is to watch the log files. When you plug in a USB device, your system should make a note of it in /var/log/messages. So do the following:

tail -f /var/log/messages

In case you didn’t know, the tail command prints the last few lines of a text file, and the -f argument means “follow”: tail will print any new lines appended to the file into the console in real time. Once you issue this command, just plug in your device. Your output should look something like this:

Dec  1 12:56:44 malekith kernel: [13631.153753] usb 2-1: new high speed USB device using ehci_hcd and address 4
Dec  1 12:56:44 malekith kernel: [13631.288125] usb 2-1: configuration #1 chosen from 1 choice
Dec  1 12:56:44 malekith kernel: [13631.288669] scsi5 : SCSI emulation for USB Mass Storage devices
Dec  1 12:56:49 malekith kernel: [13636.295004] scsi 5:0:0:0: Direct-Access     Kingston DT 101 II        1.00 PQ: 0 ANSI: 2
Dec  1 12:56:49 malekith kernel: [13636.295900] sd 5:0:0:0: Attached scsi generic sg3 type 0
Dec  1 12:56:49 malekith kernel: [13636.306962] sd 5:0:0:0: [sdc] 7831552 512-byte logical blocks: (4.00 GB/3.73 GiB)
Dec  1 12:56:49 malekith kernel: [13636.308590] sd 5:0:0:0: [sdc] Write Protect is off
Dec  1 12:56:51 malekith kernel: [13636.315523]  sdc: sdc1
Dec  1 12:56:51 malekith kernel: [13637.905840] sd 5:0:0:0: [sdc] Attached SCSI removable disk

Check out the second-to-last line – it says sdc1. What does that mean? It means that my b0rked thumb drive is assigned to the /dev/sdc1 device. Now that I know that, I can easily run fdisk on the device to rebuild the partition table that was messed up by the botched dd command, and then put a fresh filesystem on it.
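On newer systems there are quicker ways to get the same answer, and once you have the device name the rebuild itself is short. Everything below assumes the drive landed on /dev/sdc as in my case – triple-check that before running anything, since the rebuild commands are destructive:

```shell
# Same kernel messages, without needing the log file:
dmesg | tail -n 20

# Or list all block devices with sizes and models (lsblk is part of
# util-linux on newer distros; it may not exist on a 2009-era box):
lsblk -o NAME,SIZE,MODEL

# Rebuild the partition table, then put a fresh FAT filesystem on the
# new first partition. DESTRUCTIVE -- double-check the device first!
#   sudo fdisk /dev/sdc       # o = new table, n = new partition, w = write
#   sudo mkfs.vfat -n RESCUED /dev/sdc1
```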

It’s that easy. I’m putting it here for future reference more than anything else.

Kubuntu 9.10 Upgrade: Karmic nVidia Failure
http://www.terminally-incoherent.com/blog/2009/11/16/kubuntu-9-10-upgrade/ | Mon, 16 Nov 2009

Did they name Ubuntu 9.10 Karmic on purpose, and then have it ruin the lives of the wicked? My upgrade was an absolute train wreck. I spent my whole afternoon and evening fixing it, and managed to accidentally delete a few months of email. Yay me!

The upgrade went smoothly right up until I rebooted the machine and noticed that I was running 800×600 and my dual-head setup was broken. This was very noticeable on a 23 inch monitor, running the new KDE version with its super-large window decorations. So I decided to fix it.

Quick note on KDE 4.3:

WHAT THE FUCK?

It seems that the design goal of this release was to “make it as shitty as Vista”. Can we please stop doing that? Seriously, I don’t even recognize this environment anymore. It was working fine before – there was no need to change the Kmenu, the panel or the fucking desktop.

Granted, the desktop effects are actually very nice, and the plasma widgets are cool. Still, I wasn’t very happy viewing it all in 800×600 on a 23″ monitor. Try that yourself and you will see why I was angry. Without the bells and whistles the desktop was just ugly and barely functional. I’m slowly getting used to it now, and I think I will be fine, but the first impression was horrible.

So I did the exact same thing that worked for me last time: I pulled up the KDE hardware drivers app and told it to activate the proprietary nVidia driver. It didn’t work. I tried a couple more times, and then restarted the machine, thinking that maybe the damn thing was just not registering the change. That’s how I hosed my X. Kubuntu came back in text-only mode, and I had to hack xorg.conf to switch it back to the generic driver.

After this I tried following some online troubleshooting steps – installing and re-installing the drivers, hacking xorg.conf – and each thing I did made my system more broken than it was before. Eventually I managed to delete my .kde folder with several months of email in it (i.e. my last backup was a few months old, and I have no one to blame for this but my own stupidity).

PROTIP:

Do not do mv -i as root. Ever!

In fact, every time you move or delete anything from the command line, you should back up the folders in question, just in case. The beauty of working from the shell is that it does not try to hold your hand or second-guess your choices. Linux will do precisely what you ask it to do – whether it is good or bad for the system. This gives you a great deal of power and flexibility, but it comes at a price: a typo or badly formed command may actually damage the system or wipe your files.
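The habit that would have saved my mail is cheap to adopt: snapshot the directory with a date-stamped name before touching it. A minimal sketch, using ~/.kde from this story as the example:

```shell
# Snapshot a directory before experimenting on it. The -a flag
# preserves permissions, ownership and timestamps; the date stamp
# keeps repeated attempts from clobbering the first backup.
dir="$HOME/.kde"
backup="$dir-bak-$(date +%F)"
mkdir -p "$dir"              # make sure it exists for this demo
cp -a "$dir" "$backup"
ls -d "$backup"              # verify the copy before proceeding
```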

This is what happened to my email folder. At one point during the troubleshooting I got it into my head that something had gone wrong in my KDE setup. At that point I could get my machine to display the login screen, but X would crash when trying to actually load the environment. So I decided to rename my .kde directory and let the system generate a clean one to see if that would help. It did not, so I renamed it back. About two hours later I realized that I must have made a typo of some sort. When I finally got KDE to load, it had completely forgotten all my settings. I went searching for the .kde-bak directory I created earlier, but it was nowhere to be found. It just went *poof*.

It wouldn’t be that bad if it wasn’t for the fact that Kontact keeps its email files in there. Oops… I had backups, of course, but unfortunately I had been rather lax about them for the last few… um… months. So yeah – you get the idea. I was not a happy camper, and there was no one I could blame for this but myself. First, for being reckless with my commands. Second, for not making a copy prior to fucking with such a crucial directory. Third, for getting complacent and not running the backup script in god knows how long.

To make a long story short, half the solutions posted in the Ubuntu forums are total crap. It became painfully obvious that my problem ran much deeper. Reinstalling the drivers and re-creating the x config just wouldn’t cut it.

For reference, my machine is a Latitude D820 with an nVidia Quadro NVS 140m board. I was starting to think that there was just no working driver for this card compatible with the 2.6.31-14-generic kernel. Finally, after several hours, I found the solution.

Alexander V. Røyne-Helgesen deserves one free internet for figuring this out. His fix is the only thing that worked for me. In case you are too lazy to click on the link, here is the solution:

First, open up your /etc/modprobe.d/lrm-video file and comment out every single entry that references nvidia. Your file should look something like this:

# Make nvidia/nvidia_legacy and fglrx use /sbin/lrm-video to load
install fglrx /sbin/lrm-video fglrx $CMDLINE_OPTS
#install nvidia /sbin/lrm-video nvidia $CMDLINE_OPTS
#install nvidia_legacy /sbin/lrm-video nvidia_legacy $CMDLINE_OPTS
#install nvidia_new /sbin/lrm-video nvidia_new $CMDLINE_OPTS

Once this is done, go to your /etc/modules file and add this at the end:

nvidia

Finally, go to your xorg.conf, find the entry that describes your video card and change the driver to nvidia. It should look something like this:

Section "Device"
   Identifier      "NVIDIA Corporation NV40m [Quadro NVS 140m]"
   Driver          "nvidia"
   # more lines here...
EndSection

Now restart thy X server and… Boom! Back in business.

I should probably mention that I uninstalled and reinstalled the nVidia drivers about 10 times during the whole ordeal, from various sources. The last thing I tried was the EnvyNG script (the package name is envyng – it’s in the repos). So I can confirm that the method above works with the Quadro NVS 140m with a driver installed by EnvyNG. It may not work after a straight upgrade.

Did I mention that the upgrade also broke my VirtualBox installation? Yeah, it did, but that’s a topic for a whole other post. Needless to say, I am never doing this sort of thing again on a weekday.

]]>
http://www.terminally-incoherent.com/blog/2009/11/16/kubuntu-9-10-upgrade/feed/ 14
You don’t need to convert them… http://www.terminally-incoherent.com/blog/2009/11/05/you-dont-need-to-convert-them/ http://www.terminally-incoherent.com/blog/2009/11/05/you-dont-need-to-convert-them/#comments Thu, 05 Nov 2009 14:52:03 +0000 http://www.terminally-incoherent.com/blog/?p=4114 Continue reading ]]> Recently a friend of mine approached me with a weird question: how to install Windows on a machine without a CD or Floppy drive. I was intrigued. The obvious question here was “why?” It turned out that he just ordered himself one of those Dell Mini notebooks. Naturally, like every living being on planet Earth my friend hates Windows Vista with a passion and as a result he didn’t really feel like paying a Vista tax. So he opted for the Ubuntu version with the intention of installing his copy of Windows XP on the machine. But while he was in the cost saving mode, he also decided not to purchase the optional external CD drive.

Oops… That CD drive was sort of there for a reason. He realized that after it was too late to do anything about it. So now he basically wanted to know if it’s possible to install XP from a USB thumb drive. Can it be done? Apparently, yes it can. All you need to do is google for it.

But at the time we were having this conversation I gave him the benefit of the doubt and assumed he had already searched for it online and come up empty. And since I didn’t know the solution off the top of my head, I got a crazy idea.

“Why don’t you just keep Ubuntu?” I asked him.

I mean, it’s a Dell Mini with a 16GB solid state drive and a tiny ass screen – he is definitely not going to use that machine for gaming, Photoshop or other Windows-centric stuff like that. The machine will likely be used as a hardware extension of a web browser. The OS is mostly overhead on a machine like that.

Now, I’m not a Linux evangelist. I don’t go around telling people to switch to Linux. I honestly can’t do that anymore, because I know that my experience with the OS is irrelevant. I am a computer geek, a software developer and a Linux enthusiast. This makes me so far removed from the general population that I can hardly relate to your average Windows user.

Nevertheless I did my best to give him a quick pitch on how the OS will be mostly irrelevant on that machine. And it will run most of his favorite apps – like Firefox, for example.

“Will it run Chrome?” he demanded.

Of course it will run Chrome. Then again, the last time I used the Linux version of Chrome, Flash didn’t work yet, but they fucking update it daily. I quickly launched my copy of the browser to check, and lo – it was running Flash quite flawlessly now.

My friend was not fully convinced yet. He started asking me about opening Word documents, so I pulled up OpenOffice and illustrated how it works. Then I quickly downloaded and burned him a copy of the Gnome-based Jaunty to show him what the OS that ships with his Mini would look like (I’m running Kubuntu on my laptop, and it looks quite different).

I booted it on his laptop, and he was blown away when he realized you can actually run a fully functional OS from the CD like that.

“But how can it do that?”

Well, it’s really not that impressive. It doesn’t really matter whether your OS binaries are on the hard drive or on some other media. You have to load them into memory before they get executed anyway, so where they live originally is irrelevant. There is really no reason why Windows couldn’t have a Live CD version. In fact, you can easily make one with BartPE.

He was also enamored with virtual desktops. “It’s like tabs for your desktop” he said. I never thought about them like that, but yes – that’s a valid analogy. That’s technically how these things work.

He was also amazed at how many “features” were included in the OS itself. I had to explain that most of the applications he saw there were really stand-alone open source projects – but by virtue of being free software they could be included in the free OS.

To make this long story short, my friend decided to keep Ubuntu on that machine – at least for now. In fact, he said he might replace it with the Notebook Remix version he found online so he can be running Jaunty (the Dell ships with Intrepid, if I’m not mistaken). I told him that if he gives Ubuntu a try and can’t deal with it, I’ll be happy to help him with the XP installation hack. He nodded, but I saw that gleam in his eye that told me it won’t be needed. I think our little community might have a brand new member.

Now, I’m fairly sure my friend will continue using Windows. I didn’t “convert” him and make him into an exclusive Linux user. But he will give Ubuntu a try, and hopefully will like it, becoming an OS-agnostic nut bag like me. And that’s more than I could ever ask for.

We really don’t need to convert people, or try to wean them off of Windows. All we need to do is show them the alternatives and find places in which they work well – like mini notebooks, for example. This will have far-reaching effects. For one, they will no longer automatically assume that OS == Windows. They will see that there are different operating systems that can be used for different purposes. Secondly, they will now be able to call MS on their bullshit, as they will see that things can be done differently in the open source world. Thirdly… Well, they will be using Linux. The more of us there are, the better. I don’t care if he still uses Windows on his other machine – he still counts as one of us.

]]>
http://www.terminally-incoherent.com/blog/2009/11/05/you-dont-need-to-convert-them/feed/ 17
Ubuntu: Change Sensitivity of the Synaptics Touchpad http://www.terminally-incoherent.com/blog/2008/10/28/ubuntu-change-sensitivity-of-the-synaptics-touchpad/ http://www.terminally-incoherent.com/blog/2008/10/28/ubuntu-change-sensitivity-of-the-synaptics-touchpad/#comments Tue, 28 Oct 2008 14:38:28 +0000 http://www.terminally-incoherent.com/blog/2008/10/28/ubuntu-change-sensitivity-of-the-synaptics-touchpad/ Continue reading ]]> I hardly ever use the touchpad on my laptop. At work, my morning routine is plugging the external monitor, ethernet cable and USB hub into the back of my machine. Yes, I could get a docking station, but why bother? I have a little USB hub on my desk where I connect my mouse, keyboard, the external drive for backups and occasionally a flash drive or two. It is almost a desktop setup, which is practically what this machine is. The Dell Latitude 830 is a monster of a laptop. I love this machine, but it is big and bulky and definitely designed to be stationary more than portable.

Today I had the crazy idea of walking around with it and using it as a normal person would use a laptop. Bad idea! It is nice to have that big wide screen when you work on this machine, but it really was quite unwieldy when I was trying to carry it and a stack of papers around the building. Not to mention that suspend-to-disk just does not work on this machine. Not that I’m surprised. I have never owned, nor seen, a Linux laptop in which ACPI functions such as suspend or hibernate would work with any degree of reliability. If you have one, congratulations! I envy you. Perhaps Hardy will solve my issues once I finally upgrade to it. But I digress…

I took the laptop with me to the classroom without an external mouse and noticed two things. One, my keyboard was dusty, showing how often I actually use this machine as a laptop. Two, my touchpad was sluggish. A quick glance at the KDE System Settings panel assured me that there was no such thing as a touchpad settings applet. One thing was clear – I had to do something. I tried using the rubber nipple (yes, this is a technical term) located between my G and H keys, but that thing is so inaccurate it is not even funny. It is like trying to mouse around with a joystick – something that I actually did quite a few times back on the Amiga when I was too lazy to plug in a mouse in between games. It will get things done, but it is neither pleasurable nor productive.

So I decided to fix this. A quick google search told me that all I really needed to do was add a few short lines to xorg.conf. Look for the following section in your file:

Section "InputDevice"
    Identifier     "Synaptics Touchpad"
    Driver         "synaptics"
    Option         "SendCoreEvents"   "true"
    Option         "Device"           "/dev/psaux"
    Option         "Protocol"         "auto-dev"
    Option         "HorizEdgeScroll"  "0"
    Option         "MinSpeed"         "1.0"
    Option         "MaxSpeed"         "1.8"
    Option         "AccelFactor"      "0.3"
    Option         "MaxTapTime"       "0"
EndSection

This is how mine looks right now, after applying the changes. You see, I added the MinSpeed, MaxSpeed and AccelFactor options to this section. You might need to play around with the numbers, but keep in mind that the higher the MaxSpeed, the less control you have over the cursor. At 1.8 my touchpad is a bit jumpy, but I can swipe it from corner to corner of my screen without picking up my finger, which is what I wanted. I’d say that 1.5 would be the medium speed you’d want to aim for, and 2.0 is way too fast. I haven’t experimented with acceleration much because I got tired of restarting my X.

Here is the thing – why can’t I have an applet with adjustable sliders for all of this in my System Settings area in KDE? It would be much easier and more convenient than editing xorg.conf and restarting X, don’t you think?
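Short of a proper settings applet, you can at least script the edit instead of retyping it by hand before every X restart. A self-contained sketch (it works on a scratch file under /tmp, not the live config, and the values are just examples):

```shell
# Write a minimal InputDevice snippet to a scratch file, then bump MaxSpeed
# with sed. Point CONF at a backup copy of the real xorg.conf to use in anger.
CONF="/tmp/xorg-touchpad.conf"
cat > "$CONF" <<'EOF'
Section "InputDevice"
    Option "MinSpeed"    "1.0"
    Option "MaxSpeed"    "1.8"
    Option "AccelFactor" "0.3"
EndSection
EOF

# Replace whatever value MaxSpeed currently has with 1.5
sed -i 's/\("MaxSpeed"[[:space:]]*\)"[0-9.]*"/\1"1.5"/' "$CONF"
grep MaxSpeed "$CONF"
```

You still have to restart X to apply the result, but at least the tweak itself becomes a one-liner you can repeat without fat-fingering the rest of the file.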

In case you noticed the last option, MaxTapTime, being set to 0 – that is me disabling the tap-to-click functionality. Why? Because it was just too sensitive. I was sitting in class while my students were taking an exam, reading Terminally Incoherent comments. At one point I was trying to move my mouse pointer and I inadvertently clicked on one of the google video ads that sometimes show up above or below the comment box, and my laptop went:

“DUM DUM DUM DUM! THE INCREDIBLE HULK! CRAAAAASH! ROOOOAR! COMING SOON ON DVD! BA DUM DUM DUM! WHOOSH! KABLOOM!”

By that time I had of course scrolled up, so I didn’t see the video playing. I was just like “WTF??? Who is watching videos during an exam?” Then I realized it was me. Fun times.

So yeah, tap-to-click is gonzo for now. I don’t really need it, and it was more annoying than useful.

]]>
http://www.terminally-incoherent.com/blog/2008/10/28/ubuntu-change-sensitivity-of-the-synaptics-touchpad/feed/ 4
Firefox 3 Thumb Button brings up Save As dialog http://www.terminally-incoherent.com/blog/2008/10/02/firefox-3-thumb-button-brings-up-save-as-dialog/ http://www.terminally-incoherent.com/blog/2008/10/02/firefox-3-thumb-button-brings-up-save-as-dialog/#comments Thu, 02 Oct 2008 15:07:14 +0000 http://www.terminally-incoherent.com/blog/2008/10/02/firefox-3-thumb-button-brings-up-save-as-dialog/ Continue reading ]]> This has been driving me nuts since I upgraded to Firefox 3.0.1 on my laptop about a month ago. I’m still running Gutsy on that thing, and only the beta version (which crashes a lot) is in the repositories. So I pretty much did the same thing as back when FF 2.0 came out and I was running Dapper. I snagged the statically linked Linux binary from the website, dumped it into /opt/firefox and uninstalled the old version. It works fine with a single exception – my thumb button was acting weird as hell.

This requires some explaining. I’m using a Logitech VX Revolution wireless mouse and btnx to detect and remap all the additional buttons on the mouse. I configured it so that the back and forward thumb buttons emulate the Alt+Left Arrow and Alt+Right Arrow keys. These are of course the Firefox shortcuts for the Back and Forward controls.

To tell you the truth, I actually forgot how to press the back and forward buttons on the FF chrome. I just never do it. At home I am using MS Sidewinder mouse which also has convenient thumb buttons. I always use them while browsing because it is just so much faster than anything else. Faster than keyboard shortcuts because when I’m browsing one of my hands is usually holding the mouse. So thumb buttons are perfect for quick flicking back and forward between pages and I miss them when I’m forced to use a mouse which doesn’t have them.

Ever since I started using FF 3.0.1 on my laptop, the back thumb button started doing something weird. In addition to sending the browser the “Back” signal, it would also invoke the “Save As” dialog. Yes, it would just pop up in the middle of the screen. Attempting to use the dialog was futile, however. I tried to actually save whatever it was asking me to save several times, but Firefox never actually produced any files as a result of using this particular dialog.

It was nothing more than a constant annoyance. Every time I tried to go back a page, I would have to cancel this dialog. It was actually more annoying than clicking the buttons manually, or taking my hand off the mouse to do the Alt+Left Arrow thing. Since then I have reconfigured btnx dozens of times and kept searching Google for a possible solution. No luck. Eventually I figured out this had to be some Firefox setting and started digging in about:config.

Through a bunch of trial and error, and a good amount of luck, I finally managed to identify the culprit. If you have this problem, all you have to do is set:

middlemouse.contentLoadURL = false

From the Mozilla knowledge base:

Background:

This preference determines how to handle middle clicks in the content area. It was split off from middlemouse.paste, which now handles middle clicks in text fields only.

Possible values and their effects:

True: Load the contents of the clipboard as a URL. (Default for Linux/Unix.)

False: Handle middle-clicks normally. (Default for all but Linux/Unix.)

I actually never, ever use this feature because 90% of the time the thing in my clipboard is not a loadable URL. Besides, since I switch between platforms a lot, I generally don’t get used to platform specific features like that.

I have no clue why this particular setting kept picking up the Back Thumb button as some sort of “Save As” invocation. Note that this was not happening in FF 2.x, so go figure. Setting it to false solved the issue for me. I can browse like a normal person again.
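If you’d rather not click through about:config on every fresh profile, the same preference can be pinned in a user.js file in your Firefox profile directory (the exact profile path varies per installation, so it is left out here):

```js
// user.js – read on every Firefox startup, overrides the saved prefs.js
user_pref("middlemouse.contentLoadURL", false);
```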

Hopefully this post will help some other poor soul frustrated by this issue. :)

]]>
http://www.terminally-incoherent.com/blog/2008/10/02/firefox-3-thumb-button-brings-up-save-as-dialog/feed/ 1
Easy Way to Create Simple Linux Packages http://www.terminally-incoherent.com/blog/2008/09/23/easy-way-to-create-simple-linux-packages/ http://www.terminally-incoherent.com/blog/2008/09/23/easy-way-to-create-simple-linux-packages/#comments Tue, 23 Sep 2008 15:31:23 +0000 http://www.terminally-incoherent.com/blog/2008/09/23/easy-way-to-create-simple-linux-packages/ Continue reading ]]> I just figured out how to create an installation package in any of the popular formats (deb, rpm, etc.) in under 30 seconds. This method is probably not something you’d want to use for a serious project, but it is perfect for small-scale things such as shell scripts, or various perl/python/ruby concoctions you want to distribute.

Before I start, I must confess that I have never really made a deb package from scratch. I did create debs before with stuff like checkinstall. For example, I do it every time I install ffmpeg on one of my machines, because for some reason that package is horribly, horribly broken in the repos and half the features are disabled. If you want a working copy, you have to grab the source and roll up a deb yourself.

I never created a deb for one of my own scripts because I never needed to. Most of the time, the stuff I write ends up being a single script or executable, which I stick in /usr/local/bin or just keep in my home directory. If I distribute it, I always figured someone else could do exactly the same – grab the binary and stick it somewhere in their path.

But the other day I was like “maybe I’ll just make a deb for this one script here, since I already have a whole project page for it.” And so I googled “how to make a deb” and got tons of excellent tutorials, each of which was at least 50 pages long. I figured I was doing something wrong, because a simple thing like creating a package can’t be that complicated, and the creators of these extremely detailed howto pages must simply be suffering from the common case of verbal diarrhea which seems to plague at least every third Linux user.

I mean, it took me 10 minutes to write and debug this script. If wrapping it inside of a deb takes 3 hours, then we are in trouble. Fortunately I’m a very lazy individual, and instead of trying to follow one of these gargantuan howto articles, I decided to find a quicker way and installed the EPM package:

sudo aptitude install epm

EPM basically builds packages for you almost automatically. All you need to do in terms of setup is to create a single .list file in the same directory as your project. For example for Twimi I created the following file:

%product twimi
%copyright 2008 by Lukasz Grzegorz Maciak
%vendor Lukasz Grzegorz Maciak
%description A minimalistic, command line Twitter updater.
%version 0.4
%readme README
%license LICENSE
%requires curl

f 755 root sys /usr/bin/twimi twimi

I think the above is pretty much self-explanatory. The first 8 lines are metadata which will be embedded in the deb – you know, the stuff you can read when you do aptitude show. The last line specifies what to do with the project files during installation. The syntax for that line is pretty much this:

f  mode  user  group  destination  source

You can find more info about the other prefixes (there is c for configuration files, d for creating directories, etc.) by running man epm.list. All I needed was to copy a single shell script to a directory in the path and make it executable, and that was accomplished with the single line above. No need for any other tinkering. I saved the file as twimi.list and created README and LICENSE files, because apparently epm expects them. You can leave them empty, but they need to be there for some reason.
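For illustration, here is what those other prefixes might look like in a hypothetical, extended twimi.list (the /etc/twimi path and the twimi.conf file are made up for this example):

```
d 755 root sys /etc/twimi -
c 644 root sys /etc/twimi/twimi.conf twimi.conf
```

The d line creates a directory at install time (directories take a dash in place of a source file), and the c line marks twimi.conf as a configuration file, which package managers generally try to preserve across upgrades.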

Once you have all of this set up, you can create a deb by running the following command (where you’d replace twimi with the name of your project naturally):

sudo epm -f deb twimi

KABLAM! The deb will magically appear in a subdirectory named after your platform and architecture – for me it was linux-2.6-intel. An added benefit is that you can use the same .list file to generate other types of packages. Observe:

sudo epm -f rpm twimi

In addition to basic Linux packages, EPM can apparently also make OS X and BSD ones – but you will need the prerequisite package management tools for those systems installed. So I couldn’t really create an OS X package (and I don’t own a Mac, so I don’t know how I would test it), but the option is there if you need it.

Undoubtedly someone will tell me there is an easier and more straightforward way to do this kind of stuff. This method worked for me, but if there is a more proper and equally straightforward way, I’d love to hear about it.

]]>
http://www.terminally-incoherent.com/blog/2008/09/23/easy-way-to-create-simple-linux-packages/feed/ 2