Powershell

As you may have heard already, my desktop is back in working condition. I decided to turn my hardware failure into a positive thing and take the time to upgrade my desktop. Not only did it get a new and shiny video card (nVidia GeForce GTX 760 – I kinda wanted the 780, but I would have had to replace my PSU) but also a secondary hard drive on which I installed Windows 7. My rig was running Vista up until that point, mostly because I was way too lazy to perform the upgrade. I figured I would probably never get a better opportunity to do a clean OS install on this machine.

As a result, for the last week or so I have been running with a reduced set of tools on my machine. For example, the first few days I did not have Cygwin on board, because I didn't feel like going through the installation process, which requires you to manually pick packages from a long list (that, or install a bare-bones setup that typically doesn't even include ssh). So as I was installing my regular programming environments (Ruby, Python, Node) I needed a working command line client. I typically use Cygwin for most things, and cmd.exe when I'm in a pinch. The problem with cmd is that it is very, very limited in what it can do, and scripting in it is cumbersome. Bash can be quirky as a thousand fucks, but compared to the cmd shell it is a walk in the park.

Windows does, however, have a proper shell these days. It is called Powershell, and it was designed specifically to provide Windows admins with a Unix-style experience. Let's face it – while Cygwin is great for getting shit done, it is not the best tool for system administration. That's not necessarily the fault of Cygwin, but rather a design philosophy difference between the platforms. POSIX systems are designed around flat files. On any Unix-like system config files are plain text, and so are log files. In fact, even system and process info is accessible as flat files in the /proc directory (except Macs – Macs don't proc). As a result, most of the Unix tools used by admins evolved towards being great at parsing, processing and editing text files. On Windows, on the other hand, almost all admin-relevant data is encapsulated in some sort of object store. Configuration is sealed in the Registry hives, system info is hiding in the WMI store, and logs are stored as Event Viewer blobs. All these constructs are proprietary Microsoft inventions and can't be parsed using standard POSIX tools. As a result, a Unix transplant such as Cygwin can only ever be a second class citizen in the Windows universe.

Powershell was designed to be a cmd.exe replacement that works with Windows rather than battling against it. It provides easy access to all the underlying data structures that are foreign to Unix tools, and much too complex for its simplistic, DOS-derived predecessor. Not only that, but it also throws a bone to anyone coming from a Unix background by aliasing a lot of its internal commands to the classic short POSIX names. So, for example, if you type ls at the prompt it will work exactly as expected.
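
You can inspect these mappings yourself with Get-Alias. A quick sketch:

# A few of the built-in Unix-style aliases and their real names
Get-Alias ls, cat, rm, pwd

# Or go the other way: list every alias pointing at Get-ChildItem
Get-Alias -Definition Get-ChildItem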

Let me give you a quick example of just how neat Powershell can be. Whenever I build Win Boxen for work purposes, I like to change their Windows name to their vendor service tag / serial number, because this helps a great deal with asset tracking automation and the like. How difficult is that to do in Powershell? Super easy. It can be done in 3 lines, like this:

# Pull the serial number out of WMI, rename the machine to match, reboot
$serial = (gwmi Win32_SystemEnclosure).SerialNumber
(gwmi Win32_ComputerSystem).Rename($serial)
Restart-Computer -Force
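
Rename() signals failure through the ReturnValue on the WMI result object rather than by stopping the script, so if you want to be a bit more defensive you can check it before rebooting. A sketch:

# Only reboot if the rename actually succeeded (ReturnValue 0)
$result = (gwmi Win32_ComputerSystem).Rename($serial)
if ($result.ReturnValue -eq 0) { Restart-Computer -Force }
else { Write-Host "Rename failed with code $($result.ReturnValue)" }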

So what is the problem with Powershell? How come people still bother with cmd.exe batch files or VBS deployment scripts? How come Powershell did not become the one and only Windows shell?

Well, in its infinite wisdom, Microsoft decided to cripple Powershell long before it could ever become popular. By default it starts with a restricted execution policy. This means no scripts of any kind will ever be allowed to run on your machine. When you double click a .ps1 file it throws an error. When you try to invoke a script you wrote yourself from the command line, it will puke half a page of errors at you before it crashes and burns. To recap, the design process went something like this:

  1. Design a powerful new shell that can effortlessly hook into all the subsystems and data stores a Windows admin could ever need.
  2. Give all the commands long, descriptive names that are conducive to scripting (yielding clean and readable code), and make the classic short commands mere aliases – cd, for example, is just an alias of Set-Location.
  3. Install this shell on all modern Windows versions.
  4. Register .ps1 as the executable Powershell script file type, so that anyone can double-click these files to run scripts.
  5. Disable running of all scripts by default.

What the fuck?

Seriously, what the fuck happened there? What was the reasoning behind it? This is like creating image editing software and disabling editing of images by default.

Actually, let's back up. Powershell is not new. It has existed since the Windows XP days, back when Microsoft's approach to system security was more or less “lol, buy a Mac”. They of course rescinded that policy as soon as OSX came out and people figured out that Macs were actually a viable option. Today we live in a world in which Windows actually ships with a somewhat sane security setup. This was not the case in the XP era, when you couldn't let a machine touch any networks until it was fully patched and bristling with at least 3 brands of security software. Back then the engineers saw the .ps1 file format and went “great, yet another malware delivery vector”. So they did the best thing they could: they plugged that hole before it became a serious security threat.

Naturally this ended up being only a half measure, because you can still easily trick people into running Powershell scripts by asking them to copy and paste code into the command line window. For example, this is how you install Chocolatey. Granted, this requires slightly more social engineering than just giving someone a script renamed to appear as porn.
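
At the time of writing, the Chocolatey install was exactly this kind of paste-into-the-prompt one-liner (check their site for the current incantation rather than trusting my memory):

iex ((new-object net.webclient).DownloadString('https://chocolatey.org/install.ps1'))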

Which is why we don't really have Powershell based viruses out there. Powershell was forever enshrined as the scripting language created by sysadmins, for sysadmins, to do admin stuff – but only if it does not involve users in any way. Why? Because to enable scripting you need admin privileges on the machine. Which is something sysadmins usually have, and the end users they support do not. Which means that you can't just write a shell script and hand it to users, but you might be able to use one to deploy something across the domain.

If you are planning to use Powershell as a cmd.exe replacement, the first thing you need to do is change the execution policy to enable scripting. To do that, run Powershell as Admin and execute the following command:

Set-ExecutionPolicy RemoteSigned
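
You can verify the change with Get-ExecutionPolicy. Also worth knowing: policies are scoped, and setting one for just your own user account does not require elevation. For example:

# Show the effective policy at every scope
Get-ExecutionPolicy -List

# Per-user policy, no admin rights needed
Set-ExecutionPolicy RemoteSigned -Scope CurrentUser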

From that point on, scripts you write yourself or download from the internet will run as expected. The second thing you probably want to do is create a profile. In Powershell, profiles work exactly the same way Unix profiles do. It is a script that gets automatically executed whenever you launch a new shell. The default path to your profile is kept in the $profile automatic variable. That path is always set, but typically the file itself won't exist until you create it yourself. This can be easily done from the command line like this:

new-item -path $profile -itemtype file -force
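
One caveat: -force will happily clobber an existing profile, so if you are not sure whether you already have one, guard the command with test-path:

# Only create the profile if it does not exist yet
if (!(test-path $profile)) { new-item -path $profile -itemtype file -force }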

At that point you can open it up in your favorite text editor like so:

notepad $profile

Substitute your preferred editor for notepad, as long as it is in your PATH. What goes in the profile? One thing you should probably consider putting there is a fancier shell prompt. Another thing could be aliases. Here is mine:

function prompt
{
    # Check for Administrator elevation
    $w=[System.Security.Principal.WindowsIdentity]::GetCurrent()
    $p=new-object System.Security.Principal.WindowsPrincipal($w)
    $a=[System.Security.Principal.WindowsBuiltInRole]::Administrator
    $isAdmin=$p.IsInRole($a)

    if ($isAdmin) 
    {
        Write-Host "ADMIN" -NoNewLine -ForegroundColor White -BackgroundColor Red
        Write-Host " " -NoNewLine
    }

    Write-Host $ENV:USERNAME -NoNewLine -ForegroundColor Green
    Write-Host " @ " -NoNewLine
    Write-Host $ENV:COMPUTERNAME -NoNewLine -ForegroundColor Yellow
    Write-Host ": " -NoNewLine
    Write-Host $(get-location) -NoNewLine -ForegroundColor Magenta

    Write-Host " >" -NoNewLine
    return " "
}

set-alias gvim "C:\Program Files (x86)\Vim\vim74\gvim.exe"
function g { gvim --remote-silent $args }
function gd { gvim --servername DEV $args }

To change the command prompt you simply define a prompt function. The only caveat here is that it must return something that is not zero and not an empty string. If you omit the return statement, or return zero or a blank string, Powershell will simply append PS> to whatever you printed. Other than that, you can echo out just about anything.
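
If all the Write-Host plumbing above looks intimidating, keep in mind that a perfectly valid prompt is just a function returning a string:

# Bare-bones prompt: current path followed by >
function prompt { "$(get-location) > " }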

As you can see above, I'm using a unix-style prompt with my username, host name and path. If Powershell is launched with elevated rights, there is also a big, red, honking “ADMIN” tag up front to let me know I'm in the danger zone. Once you launch it, it looks more or less like this:

[Screenshot: My Powershell prompt]

And yes, my desktop is Gandalf and my laptop is Balrog, and they are usually on the same desk. I do realize I'm probably courting disaster with this naming scheme.

Naturally, Powershell is not a replacement for good old Cygwin. For one, Cygwin provides me with native ssh and scp, whereas with Powershell I have to use PuTTY and similar tools as proxies. Well, that, and I don't think I could ever wean myself off the basic POSIX tools. Especially since I tend to jump between Ubuntu, OSX and Win7 all the time.

Do you use Powershell on any Windows boxes? Do you have any nifty tips or tricks? What is in your Powershell profile right now? Let me know in the comments.

Unix ANSI colors and Windows colors in the same terminal

My windows setup is weird. Well, to be honest windows itself is a weird OS, but I make it even weirder by using Cygwin as my primary shell over there. This is not a perfect solution. In fact it is downright lousy, considering that Cygwin (as wonderful as it is) is a hackish attempt to port the entire unix stack onto windows – which doesn't actually make it a good windows shell. It just gives you a unix shell that happens to run on the platform, but mostly exists apart from and beside windows, and for better or worse it is kind of a second class citizen.

If you are primarily a Windows user and not a filthy OS nomad like me, then using Cygwin is probably counter-productive. I'd highly recommend making Powershell your primary shell instead. Why? Because it is a first class citizen, and the proper admin tool for the OS. There is a great article by one of the engineers who developed Powershell which I never bookmarked – but trust me, it exists somewhere. It touched upon the genesis of the project, and how it grew out of the desire to have a Unix-like environment on the Microsoft platform.

You see, the Unix shell is great. Everyone loves that environment, including Microsoft engineers. They liked it so much they actually made a valiant attempt to port a good chunk of the standard POSIX toolkit to Windows, and bundled the results piecemeal in several Windows Server Resource Kits over time. Unfortunately, they quickly realized that standard Unix tools are mostly geared towards editing and processing text files, because Unix keeps most of its configuration in plain text. Porting these tools to windows doesn't help administrators much, because they don't deal with text files most of the time, but with weird constructs like the Registry. So while having a full POSIX compatible stack would be great, admins didn't necessarily need it to do their job. What they needed was a shell that would give you the flexibility and robustness of unix command line environments (ie. not cmd.exe) but be able to juggle and manipulate windows internals as easily as unix tools manipulate text files. That's how they built Powershell, and that's why you should be using it.
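
The crucial design difference is that the Powershell pipeline passes live objects instead of text, so there is no column counting or regex surgery involved. A rough sketch of the contrast:

# Unix way: scrape the text output of ps with awk (column numbers and all)
#   ps aux | awk '$6 > 102400 { print $11 }'

# Powershell way: filter actual process objects by property
Get-Process | Where-Object { $_.WorkingSet -gt 100MB } | Select-Object Name, Id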

But to me it holds little value, because I primarily use Windows for gaming. If I do something serious on it, like programming, then I'd rather have it behave like unix. Which is what makes Cygwin a perfect tool for me. It allows me to use the same dot files and scripts across the board on all my machines.

Cygwin, being what it is, is not always perfect. A lot of tools I have on other platforms simply haven't been ported to the Cygwin environment and probably never will be. For example, Pandoc does not run under cygwin. It does however run under windows, and you can actually call it from within the Cygwin bash shell as long as you are mindful of the paths (Cygwin does this thing where it has its own internal filesystem, and your Windows drives are mounted in the /cygdrive directory).

Another tool I use this way on windows is Git. When I first set it up on windows, I went with the msysGit version rather than the Cygwin one. My reasoning was that I wanted it available natively in the OS, for tools that may want to use it. When I later installed it natively in Cygwin I noticed that having two slightly different versions of Git on your system doesn't work very well. So I decided to keep the native Windows version and call it a day.

I use MinTTY as my primary terminal, and I noticed that whenever I called git from it I didn't get the nice colorized output I'm used to. This is mostly a MinTTY issue: it was designed to be a unix terminal running on Windows, so it understands ANSI escape codes, but native Windows programs set their colors through the Win32 console API, which MinTTY does not emulate.

[Screenshot: The color problem – note the lack of color in the git output.]

The easy solution to this problem is to switch to the default Cygwin console, which is essentially the standard windows terminal with a back-hack that lets it render the unix ANSI color codes. But I refuse to do that, because that terminal blows. I searched long and hard for a MinTTY centric solution, but it appears that while a number of people complain about this, the developers are not particularly interested in fixing it, because it is hard, and since they personally never call native Windows apps from within MinTTY it is not a real problem. Because that's the way we do things in the open source community, folks. Oh, you have a problem that doesn't actually affect me? Fuck you then. How about you write a patch and maybe I'll consider merging it in. Actually I'll probably just delete it, but who knows.

Eventually I realized that if I actually want to have color in my Unix shell, I need a different terminal, and I eventually stumbled upon ConEmu, which is an aggressively developed, all-purpose Windows terminal. By all-purpose I mean it can actually work amazingly well as a Powershell terminal. By default it wraps itself around cmd.exe and gives you a slightly less shitty command line experience. It supports a tabbed interface, transparency, and has a nifty status line.

If you want to run Cygwin inside of it, all you need to do is to create a shortcut (or a script) that will call it like this:

C:\ConEmu\ConEmu64.exe /cmd "C:\cygwin\bin\bash.exe" --login -i
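
ConEmu doesn't much care what it wraps, so the same trick works for other shells. To get a Powershell tab, for example, something like this should do (assuming the same install path as above):

C:\ConEmu\ConEmu64.exe /cmd powershell.exe -NoLogo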

Obviously, you may need to adjust the paths a bit. I installed both Cygwin and ConEmu in the root directory of my c: drive, because c:\Program Files (x86) is the worst path in the universe, and whoever decided that a crucial system path needs two spaces and two non-alphanumeric characters in it needs to be slapped upside the head with a hedgehog.

Since the ConEmu developer envisioned the terminal as a multi-purpose tool, ANSI color code support is baked in. This means that everything just works as it should, without you having to do anything:

[Screenshot: ConEmu – both ANSI and Windows color codes work.]

Hell, the default UI even comes with Solarized theme presets you can enable from the Settings menu. This was rather surprising to me, because very few Windows centric tools know or care about the Solarized project. In fact, the quality of a windows tool aimed at developers can be almost solely judged by the degree to which it supports these themes. If it ships out of the box with a default Solarized theme, it means serious business – download and/or buy it immediately. If it has it as an option, it's probably pretty good. If it's missing… Well, as long as you can use custom themes it might still be OK, but it will probably suck.

ConEmu has 3 versions of the dark theme and two of the light theme, and they are all off in various ways. I know this because I'm a Solarized junkie: I install it on everything, and I write my scripts to output colors in such a way that they look good in these themes. All of the included themes got the overall look right, but then failed at rendering certain colors (especially red, green and pink/purple) which are integral parts of my bash prompt. So I had to tweak it a bit:

[Screenshot: My adjusted version of ConEmu Solarized-Dark]

You can compare it to the default color setup to see the changes I made. I think it is still slightly off, but at this point it doesn’t bother me anymore. Still, the creator gets points for trying. While the implementation wasn’t perfect, it was close enough that it took only a few seconds to fix.

The only real gripe I have with ConEmu is that its color palette is rather limited, and it doesn't let me do things like run Vim with the Solarized Light theme while coloring itself according to Solarized Dark. MinTTY handles this sort of thing swimmingly and without complaining. That said, when I'm working on Windows I typically use Gvim rather than console vim, so this is less a problem than a minor nuisance. Besides, if I do want to use console vim, I can always fall back on MinTTY and just deal with the lack of color in git output.

I'm not actually sure if this post will be useful for anyone who isn't me, because like I said – I'm a bit of a weirdo when it comes to my windows shell environment. I like it to work and feel the same way as my unix and mac environments. What do you guys use? Do you run Cygwin on windows? What terminal do you use? How about Powershell? Let me know in the comments.

PowerShell: Delete Files Smaller than 10MB

Remember how I always talk about redundant backups? Let me tell you a story about what happens when you don't have them.

A few months ago I purchased a 2TB external drive to replace an older LaCie drive that I suspected would fail soon. Why? Well, it was becoming progressively louder, and sometimes it would take up to 10 minutes to “spin up” before the system could detect it. I wasn't actively using that drive for anything vital, but it still contained about 200GB of random crap – mostly movies, random torrents, etc. Most of that stuff was either easily recoverable, or I didn't care about it that much, so I didn't see much reason to back it up. Still, as long as it was there I did not want to delete any of it. So I moved it all to the new drive, and after a few days wiped the failing device clean, intending to use it as an extended /tmp directory.

Fast forward a few weeks, and the new 2TB drive failed hard, without warning. One day it just stopped working, taking all my files with it. I was mildly annoyed, as I would have liked a chance to wade through that garbage to see if there was anything worth saving. But I accepted the failure as a lesson for the future: unless you are going to delete it within the next 2-3 hours, it is worth saving in at least two places.

Fast forward to the present day, and I experienced a sudden epiphany. I hadn't really been using the old, failing drive for anything all these months. The data I deleted from it might not all be gone. I promptly fired up Recuva and managed to recover a few thousand files from that drive. Unfortunately it was not all of it, because I did use the drive a few times to save random garbage. What's worse, while Recuva managed to pull a lot of my files back from the brink of oblivion, it did lose the folder structure. More or less, it just unceremoniously dumped everything into a huge, disorganized pile in a single directory, leaving me to wade through the wreckage trying to decide what goes where. Things I actually cared about (movies, ebooks, iso files) were lost in a sea of random jpegs, dat files and corrupted file fragments.

I started sifting through this junk, but I realized it would take hours to organize. So I started writing a script that would knock out the lion's share of the random garbage. Initially I planned to do it by extension – just delete all the stuff that was not movies, pdfs and the like. But the drive was littered with a vast number of different file formats, so making a list (black or white) was going to be tedious. Then it dawned on me – I can just do it by size. Chances were that anything smaller than 10MB was either not worth my attention or not worth saving.

Small problem: I don't think it is possible to write a Windows batch script that deletes files based on their size. Or at least I couldn't figure out a way. I was about to whip out some Unix to deal with this, but then I stumbled over a PowerShell icon on my way there.

“Hey, why not…” – I said to myself and got to work. It turns out that Powershell was made for stuff like this. Observe:

ls | where {$_.Length -lt 10mb} | Remove-Item

The keyword ls here is not the unix command, but rather an alias to gci, which is short for Get-ChildItem, which is essentially the PowerShell version of ls. So in a way it is ls, I guess.

The where keyword (an alias for Where-Object) filters the listed files. Perl hackers will probably recognize $_, which here stands for the current item. The -lt is PowerShell speak for “less than”, and Remove-Item should be self explanatory.
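
One thing worth doing before you pull the trigger on a command like this: Remove-Item supports the standard -WhatIf switch, so you can dry-run the deletion and see exactly what would be nuked:

ls | where {$_.Length -lt 10mb} | Remove-Item -WhatIf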

Verdict: PowerShell ain’t so bad. It is a little bit weird at first, but you get used to it.

Oh, and if you are about to say “why didn’t you just order the files by size in Explorer and hit delete”, don’t. I could have done that, sure. But I wouldn’t have a blog post if I did that, would I? So take that Mr. or Ms. Hindsight.

AirPrint

Have you ever tried printing something from an iOS device lately? Can you explain to me why Apple decided not to give power users a way to use standard network printing protocols?

Apple devices do not do TCP/IP printing. At all. Instead they use a proprietary apple protocol known as AirPrint which, in my honest opinion, is both brilliant and horrible at the same time. It is brilliant because if you live in an Apple friendly ecosystem (and by that I mean you own an AirPrint enabled printer, or you have your printer hooked up to an AirPort Extreme) it just fucking works™. This is what it looks like:

[Screenshot: It just works]

Like fucking magic (which, by the way, is different from regular magic) the printer just shows up in your print menu. There is no setup, no drivers, no futzing around with IP addresses – you just hit print, and it does. The printers advertise their presence on the network, and the iOS devices know how to find them. It is amazing. This is the sort of thing that makes Apple users such snobby jerks – if you do things the Apple way, things just fall into place and there is little to no friction at the interface between a human and his technology.

The problem is that we don't live in an Apple-centric world. Most households have an eclectic mix of different technologies that usually manage to work together quite well. Sometimes though, making them talk to each other can be a hassle. If you have ever had the pleasure of setting up a network printer in a mixed OS environment, with multiple windows machines all joined to different domains and work groups without a common file/print sharing server, you know it usually isn't as simple as it sounds. The plug and play simplicity of AirPrint solves this issue in a refreshing way, but introduces other problems.

If you happen to own a slightly older, TCP/IP network printer without AirPrint support, and you don't feel like tethering your perfectly functioning device to an expensive AirPort box, you are shit out of luck. No native printing from iOS devices for you.

Hold on though – AirPrint is basically just a print service, right? Your printer or AirPort box runs a tiny server that then talks to the mobile devices, making the magic happen. What if we made a fake AirPrint server that runs on your desktop computer, and acts as a proxy between your local printers and the wireless devices?

It turns out it has been done. In fact OSX has had AirPrint functionality built in, but Apple sort of nerfed it and fuddled with it across the last few OS updates, so that it no longer works properly. Or at least not without some work. Fortunately some good soul made an app for that.

AirPrint Activator is an OSX application that does exactly what I outlined above – it pretends to be an AirPrint server and makes all the printers you have access to on your Mac available to your iOS devices. The creator essentially re-wrote the server functionality from scratch, so he promises that this app will continue working even if Apple decides to rip all AirPrint code out of Mountain Lion.

Unfortunately, my Mac is a laptop, and like most laptops it is not always on. When I'm using an iPad to print something, it is usually because I can't be bothered to crack open my MacBook and wait for it to wake up. And if I do, then I might as well print from there. My Windows desktop, on the other hand, is on most of the time. So I searched for a similar application that would run on 64 bit Vista (shut up, I know it sucks but I don't feel like upgrading).

I found one rather promising lead that was echoed across various forums, and promised to give you a native Windows AirPrint service. Unfortunately it involved downloading a shady looking zip file from Rapidshare, running an executable contained therein, and then applying two registry hacks, also included in the bundle. Now I don't know about you, but I don't really trust random executables from an unknown source posted on some random forums. Running such things is like finding a dirty syringe on the street and plunging its rusty needle directly into your vein, because a guy on the corner said it's a totally legit cure for your hay fever. Unless you are the protagonist of a Bioshock game, you should not do that sort of thing.

There is a more legit looking application out there called FingerPrint2. Unlike AirPrint Activator it is not free, but it has a free demo, so I decided to check it out.

It turns out it is very Apple-like in design, and it follows the hassle free AirPrint philosophy quite closely. You install it on your PC, and then you see something like this:

[Screenshot: FingerPrint]

You just check off which of your available printers you want to share with your iOS devices, and you are done. No setup, no configuration, no mucking with permissions. It just works. I was quite impressed. Then I tested it, and my excitement waned.

FingerPrint works like magic, but its creators decided that giving you a fully functional demo would be too nice of a thing to do. So unless you buy a license and activate your copy, it will print a gigantic full color watermark on the first page of every print job. And I'm not talking about a small logo in the corner, or discreet diagonal text. No, they slap a gigantic, high resolution, full page image with vibrant colors in the middle of the page, blotting out all the text. I guess they really want you to buy it.

The full license is $20, which is quite a steep price for functionality you can get for free on a Mac. But I guess I can't fault their logic – if you are an owner of an iPad or an iPhone, then shelling out twenty bucks on a tiny little app that sits in the tray won't kill you. Still, it seems like a lot of money for what it provides. Then again, maybe I'm just falling into the same cognitive trap that Oatmeal aptly described in his comic.

Why is it so difficult to evaluate what software is really worth? Well, perhaps the question here is whether or not software utilities are goods that ought to be sold to begin with. Part of the reason behind the free software movement is that most people just feel uncomfortable buying or selling software. It just does not seem right.

But I digress. How do you print from your iPad? Is there a cheaper alternative or should I just suck it up and give FingerPrint people that twenty? What do you think?

Scripting Windows the Unix Way

Sometimes you gots to script windows. If it's my personal rig, I usually just use Cygwin, because that's where all the tools I need reside on Windows boxen. Either that, or I just hack in Python, which became my replacement for Perl after I went back and tried to read a 3 year old Perl script that broke. I know that code clarity depends on the programmer – and I'm very good at being sloppy in every language I know – but my shitty Python code is marginally more readable than my shitty Perl from that period in my life when I decided I was really good at regular expressions.

But this is not about scripting for myself. This is about writing scripts that have to work on a range of machines that won't have Perl, Python or Cygwin installed, because they are operated by functional halfwits. More or less, the typical use case works like this – an end user walks in with a computer in tow and goes:

“Yo, my shit is all retarded.”

At that point your job is to un-retard his shit, whatever that might mean. Actually, what it usually means is that they changed, deleted or misplaced something. The usual procedure is to make them download and run a bunch of installers, reset their home pages and re-jiggle their thingymabobs. This could be done by hand, but it is usually tedious. I have already created a tool that does a lot of this tedious work for me. While said tool became an indispensable asset, I try to keep it a generalized, all purpose tool – a Swiss Army Knife of sorts. I needed a set of specialized scripts that would parse, change, delete, download and run files to do some very specific tasks. Tasks that may periodically change – where periodically is defined as “more often than I would want to recompile the damn code”.

Most of the time, you would probably do this sort of shit in VBScript. Before Powershell was a thing, VBScript was the go-to scripting language on Windows. It still is, seeing how Powershell is not installed by default on Windows XP, which is still on roughly half of the machines I have to deal with. The problem with VBScript is that it is a shitty language – and a verbose one too.

Let me give you an example – when I’m on Linux, Unix, Cygwin or a Mac, and I need to download a file from the interwebs, all I need to do is:

wget http://example.com/somefile.zip

In VBScript this is slightly more complicated:

URL="http://example.com/somefile.zip"
saveTo = "c:\some\folder\somefile.zip"

Set objXMLHTTP = CreateObject("MSXML2.XMLHTTP")
objXMLHTTP.open "GET", URL, false
objXMLHTTP.send()

If objXMLHTTP.Status = 200 Then
   Set objADOStream = CreateObject("ADODB.Stream")
   objADOStream.Open
   objADOStream.Type = 1 'adTypeBinary

   objADOStream.Write objXMLHTTP.ResponseBody
   objADOStream.Position = 0 'Set the stream position to the start

   Set objFSO = Createobject("Scripting.FileSystemObject")
   If objFSO.Fileexists(saveTo) Then objFSO.DeleteFile saveTo
   Set objFSO = Nothing

   objADOStream.SaveToFile saveTo
   objADOStream.Close
   Set objADOStream = Nothing
End if

Or something along these lines. I actually did not test this – I shamelessly lifted the code from here. I just don't care enough to actually make sure it's correct – if it's wrong, then it's wrong. Don't use that code. I guess what I'm saying here is that VBS is a shitty general purpose programming language that can be used for scripting, but it ain't pretty. It was designed by people, and for people, who thought that Visual Basic was a good idea, and it shows. The Unix shell, on the other hand, is an elegant command line environment with a smorgasbord of nifty tools that work beautifully out of the box. Tools that are self contained, mature, tested, and follow the unix philosophy of doing just one thing, but doing it well.

The script above poorly imitates a fraction of the functionality of wget, and it is flimsy, ugly and a pain in the ass to maintain. It may solve one problem (downloading files from the internet), but wget is not the only tool I would like to use on a daily basis. There is about a dozen other GNU utilities that I would like to have on Windows: sed, grep, diff, patch, head, tail, touch – just to name a few. All extremely useful, all nontrivial to re-create functionality-wise in VBScript.

For example – why spend an hour fiddling with VBS string processing functions and end up with about 100 lines of unspeakably ugly code (90% of which is boilerplate and padding) if you could write 4 regular expressions and feed them to sed to accomplish the same thing? Granted, regexps are unspeakably ugly in themselves most of the time, but it's still 90% less ugly per volume if you think about it.

The standard windows scripting environment (cmd.exe) is less verbose and more like the unix shell in some aspects. It's unfortunate that it is hampered by its syntax, and a very limited set of utilities. Powershell is much better in this respect, but it is both more verbose and VB-like, and not as ubiquitous.

If you could somehow “borrow” a bunch of GNU shell utilities and bundle them with your standard Windows batch scripts, you could actually have quite a powerful tool at your disposal. And I'm not talking about Cygwin. Yes, it is nice, but often you don't want the whole kit and caboodle – a separate shell with its own set of environment variables and its own filesystem hierarchy is overkill for a lot of tasks. Ideally you'd just want to cherry-pick select utilities – for example, if your script only needs sed and wget, then you would only include these.

Some time ago, I discovered an old, but still somewhat relevant project called Unix Utils. Its aim is basically to create dependency free Windows ports of all the core Unix utilities. The package ships with a rudimentary shell (sh.exe), but the tools in usr/local/wbin are actually completely portable. You can extract the entire package, take out wget.exe, drop it in the same directory as your batch script, and it will work.

The downside of this method is that it creates dependencies for your script. If you distribute it via email, you need to include all the external GNU executables with it. This is a problem, since your average office drone can't be trusted to properly extract a zip file. I tried – on average my users failed to unpack such a bundle 13 times out of 10. No, that's not a typo – that's just how hard they failed.

Luckily, there is a tool that can help with that. It's called WinRar and everyone loves it. I know, because I once ran a poll and WinRar kinda won. WinRar is a neat compression tool, but it also has the ability to build so-called SFX archives – self extracting bundles that can be instructed to run a program when they unravel themselves. You can do that directly from the GUI, but it is tedious – a lot of clicking is involved. If you will be building and re-building your batch scripts a lot (and you will) you want something you can automate. Fortunately, everything you need is in the WinRar program directory:

  • rar.exe – a stand-alone command line version of WinRar
  • Default.SFX – the binary stub that self extracting archives are built around

You can grab those two files from the WinRar program directory and put them wherever. As long as both are in the same directory you don’t even need WinRar installed on the machine where you will be building the SFX bundle.

The next step is to create a config file, sfx.conf, where you specify where and how the bundle is to self-extract. Here is an example:

Path=%TEMP%
Setup=%TEMP%\somedir\batchscript.cmd
Silent=1
Overwrite=1

Quick explanations:

  • Path – is the directory where you want to extract your shit. I’m using the temp directory.
  • Setup – is the program to be run after successful extraction. Note that I’m assuming that the bundle will extract to a sub-directory called somedir.
  • Silent – setting this to 1 suppresses the GUI extraction dialog
  • Overwrite – ensures that old files get overwritten as they are extracted

Now, you put your batch script and all the things you want to bundle with it in somedir\. Outside it, you put rar.exe, Default.SFX and sfx.conf. Once everything is in place, you run this command (or, you know – make it a script):

rar a -r -sfx -z"sfx.conf" setup.exe somedir\

Boom, now you have an executable called setup.exe which will quietly self-extract to the temp dir and run your batch script, allowing it to call any and all binaries you included with it.

You want a practical example where this might be useful? Here is a script that changes the home page in Chrome. Changing the IE homepage is somewhat trivial – it requires a simple registry hack. Changing it in Chrome, after it has already been installed and configured, is a tiny bit more complex. Essentially you need to parse the user's Preferences file and change two values in it. This can be done in a number of ways, but being a unix geek I opted for something like this:

@echo off
rem Rewrite Chrome's Preferences file in place, via a temporary copy
set ppath=%USERPROFILE%\Local Settings\Application Data\Google\Chrome\User Data\Default
sed -f chrome_homepage.sed "%ppath%\Preferences" > "%ppath%\Preferences.txt"
del "%ppath%\Preferences"
move "%ppath%\Preferences.txt" "%ppath%\Preferences"

Here is the Sed script that does the actual work:

s#homepage\": \"[^\"]*#homepage\": \"http://example.com#
s#homepage_is_newtabpage\": true#homepage_is_newtabpage\": false#

If you are having trouble reading it, it's because I'm using hash-marks (#) instead of forward slashes as regexp delimiters, to avoid the zigzagging pattern of escaped slashes that usually accompanies regexps dealing with URLs.

My SFX archive then includes 3 files – the batch script, the sed script, and the sed.exe executable from the UnixUtils project. The user gets a bundled executable that will flash a command line window for a split second, and their home page will be auto-magically reset to the proper value.

Is this the best possible way of doing this? No, probably not. It’s rather unorthodox, and old time Windows admins will probably yell at me for doing this. But it works, and it does let me accomplish a lot of complex tasks using good old Unix functionality without having to bang my head against the wall debugging VBS code, or forcefully install Powershell on WinXP machines.

Value too large for defined data type

In my super-massive vim post I mentioned that CTAGS are the bees-knees of the vim lifestyle. A lifestyle which may not be as glamorous as that of marketroid marsupials, for example, but we fucking like it. I just found a wee bit of an issue while tagging code, so I decided to document it here.

You see, one of my original purposes for writing this here online diary of sorts was chronicling my misadventures in technology. Back then I was in school, learning to program and discovering Linux. Whenever I learned something interesting, or solved some issue, I always had that feeling of “I should probably write this down somewhere, because it is bound to come up again”. Naturally I never did, and kept slamming face first into the same exact problem every 6 months or so (ie. every time Ubuntu decided to fuck me over by releasing a shiny new update that would of course break everything on my laptop). Eventually I realized that I could use my blog (which back then I used mostly for “I am angry at this news article I found on /.” posts) as a virtual sysadmin notebook. So I would write down my problem, my solution, and promptly forget about it. Six months later I would have a temporary bout of amnesia and do apt-upgrade-and-break-everything again, because compiz got like a new shiny bauble or something. This would invariably make my laptop stop working in all possible ways, and I would spend many unhappy hours googling stuff barfed up by dmesg. And I would find my blog, with the solution.

Since then Terminally Incoherent has evolved into something entirely different. But I still need a place to chronicle weird technology things. I could start a new blog for that, but… You know… Lazy. So I will leave this here if you don’t mind.

The other day I was happily running ctags -R on some php code and suddenly it vomited a whole slew of errors like this at me:

ctags: Warning: cannot open source file “index.php” : Value too large for defined data type

Apparently every single file in that directory was “too large for defined data type”. Of course the first thing I did was google that message, but that didn't actually do me much good. Turns out that this message has nothing to do with exuberant-ctags, and everything to do with networking. In fact, it is precisely a CIFS issue.

Here is a little background – the code I was trying to tag was hosted on a windows machine. I was sitting on an Ubuntu box (yes, right on top of it) and accessing it remotely. Why? Well, I figured I might as well keep the code in an environment that closely resembles how it will be deployed (ie. a WIMP setup). At the same time I wanted to be comfortable, coding on my primary work rig, which just happens to run Ubuntu. So I hooked the test win-box up to the local network, created a network share for the code directory, and mounted it on my machine using the following fstab entry:

//server/share  /remote/s/  cifs  rw,user,noauto,username=foo,password=bar,domain=baz,uid=1000,gid=1000  0  0

It turns out that under some conditions a CIFS mount hands applications values (such as 64-bit inode numbers) that are too large for them to handle, and this mostly happens when dealing with a remote Windows share. 90% of applications either don't notice, or just don't give a shit. Exuberant Ctags however throws a fit and refuses to read files on a CIFS drive. Fortunately, there is an easy workaround. You simply have to add the following options to your fstab entry: nounix,noserverino. (Remember to unmount and remount the share afterwards, so the new options take effect.)

So the above line should look like this:

//server/share  /remote/s/  cifs  rw,user,noauto,nounix,noserverino,username=foo,password=bar,domain=baz,uid=1000,gid=1000  0  0

Hopefully this will help some other poor soul who made the folly of running ctags on a cifs share. It will probably help me at some point down the road, when I manage to forget all about this.

Creating BartPE from Windows XP Dell OEM CD

Live CDs are great. When I first downloaded and booted up Knoppix, it blew my mind. A whole operating system that runs off a CD. But why not? The concept behind these distributions is simple enough. After all, the kernel of your OS must be in memory anyway, and most modules and applications are loaded as needed from storage, which in most cases can be read-only. The only areas where you actually need write permissions are a few select directories that the system uses to dump log files and temporary garbage. But you can easily implement a virtual disk in RAM that will pretend to be a traditional read-write storage system, and you are good to go. Simple, easy to implement and incredibly useful.

Initially this technique was used mostly by a range of novelty linux distributions built to be emergency system rescue platforms, tech demos or stripped down miniature OS's that you could carry with you in your pocket. Nowadays however, the use of live CDs is widespread. Large and popular distributions such as Ubuntu use them by default on their installation CDs, for example. Such a CD can then act as a demo disk, an installer, and a rescue disk that can allow the user to recover his data when the installed OS gets hosed for some reason. It is a beautiful, user friendly and elegant way to package the system. There is almost no reason not to do this.

And yet the dominant player in the OS market does not even acknowledge this methodology as an option. I think everyone would agree that a live Windows CD would be a nice addition to the toolbox of every IT professional. Of course, the windows support niche has been filled quite well by Linux distros. NTFS support is quite good these days, and Linux based tools that let you edit the windows registry are becoming more reliable each day. Still, it is a bit ironic that you usually have to use Linux to repair or recover a Windows installation – especially considering how much money Microsoft spends on FUD campaigns against it. You would think that someone at Redmond would notice this, and decide to create a live Windows version instead of offering users baroque solutions such as the “Windows Recovery Console”.

Sadly, I doubt this will ever happen. After all, Microsoft has spent the last decade trying to tie their OS to the underlying hardware and make it less portable. Reversing their policy and creating a version of Windows that could be carried around on a CD, and boot on any machine without some crazy online activation scheme is probably out of the question.

Of course, trying to stifle progress is futile. If enough people want a live Windows CD, it will be made with or without Microsoft's help. Enter BartPE – a project to create just that, Microsoft be damned. Of course, since Windows is a proprietary OS, it is not possible to distribute an actual live CD of it without incurring the legal wrath of the software giant. What can be distributed, however, is a set of tools that will take your existing (legally licensed) Windows XP CD and turn it into a live CD. Unless, of course, you happen to have an OEM version of the CD, in which case it does not work.

Not so long ago I ran into a scenario that a live linux CD could not fix. I was dealing with a computer protected by Pointsec for PC full disk encryption. The windows installation on the system was hosed, most likely due to HD damage – at least that's what I suspected, judging from the agonizing grinding noises it was making while trying to load Windows. It was clear that parts of the file system were still intact though, because the system would hang or crash at different points during the boot procedure. What I needed from that machine were 3 excel and PDF files that could not be easily recovered if they were lost. The user naturally didn't back up, because hell – why would you, right?

I could of course have attempted to decrypt the whole drive (I had the recovery file, and admin passwords to do so), but that seemed like a risky move. With a hard drive on its last legs, the last thing I wanted was for it to work real hard for several hours, copying bits all over the place. What I wanted was to get in and grab the files I needed before the drive collapsed upon itself. A live distro was the way to go.

Check Point is actually nice enough to offer a BartPE plug-in on their installation CDs. They don't talk about it though, probably because they don't want to tempt Microsoft, which pretends BartPE does not exist. The plugin is unsupported, but it works well enough for what I needed it to do. The problem was that to build a BartPE disk I needed a Windows XP CD, and the only copy I had on me was a Dell OEM with Service Pack 2 included on the disk. My initial build failed miserably, because that Windows CD is crippled in some subtle way.

I did some googling and found a writeup that explains how to use a Dell OEM disk to create an UltimateBootCD. It is not exactly what I needed, but following these instructions yielded quite positive results. I had to tweak the writeup in a few places, but for the most part it worked. I will reproduce it here with less atrocious formatting and appropriate corrections.

CD Used for the Build: Windows XP Pro SP2 DELL OEM
Platform on which the build was performed: Windows XP Pro SP3

The procedure:

  1. Copy the contents of the CD to some directory (say C:\WINXP). Make sure you are copying hidden and system files as well as normal files.
  2. Go to BartPE installation directory
  3. Go to the Plugins folder
  4. Create a directory named Dell
  5. Inside create a file named fixdell.inf
  6. The contents of fixdell.inf should be as follows:
    [Version]
    Signature= "$Windows NT$"
    
    [PEBuilder]
    Name="Fix Dell Windows XP OEM boot problems"
    Enable=1
    
    [SourceDisksFiles]
    iastor.sys=4,,4
    a320raid.sys=4,,4
    aarich.sys=4,,4
    aac.sys=4,,4
    cercsr6.sys=4,,4
    afamgt.sys=4,,4
    NvAtaBus.sys=4,,4
    nvraid.sys=4,,4
    symmpi.sys=4,,4
    megasas.sys=4,,4
  7. Download subinacl from Microsoft website here
  8. Install it (choose all default options)
  9. Go to C:\Program Files\Windows Resource Kits\Tools
  10. Copy subinacl.exe to C:\WINXP\i386\ (or change WINXP to whatever you named your folder in step 1)
  11. In C:\WINXP\i386 create a batch file (eg. fixdell.cmd) with the following:
    reg query HKU | find /i "pebuilder" > fixdell.txt
    for /f %%a in (fixdell.txt) do reg unload %%a
    reg load HKLM\DELL setupreg.hiv
    subinacl /subkeyreg hkey_local_machine\dell\controlset001\services\iastor\ /objectcopysecurity=hkey_local_machine\dell\controlset001\services
    reg unload HKLM\DELL
    del fixdell.txt
    del /ah setupreg.hiv.log
    echo Check output to see if there are any errors.
    pause
  12. Run the batch file.
  13. Reboot
  14. Build BartPE CD
  15. ???
  16. Profit

Note that if BartPE throws up warnings about missing files, you should probably go back to step 6 and add these file names to your fixdell.inf file using the same pattern as all the other ones.

If you have a SATA drive, you may need to also slipstream the SATA drivers into your build. You need to do that before you actually build the BartPE CD:

  1. First download the DriverPacks Base and extract it anywhere (eg. c:\dp\).
  2. Next, grab the DriverPacks Mass Storage bundle. It is currently only available via torrent, so if you are behind a firewall this might be a tricky step.
  3. Take the DP_MassStorage_wnt5_x86-32_1209.7z bundle you just torrented and put it in the c:\dp\DriverPacks directory.
  4. Run DPs_BASE.exe (it's located in your c:\dp\ directory).
  5. In the Location section, select BartPE and use the “Browse” button to select your BartPE Plugin\ folder.

    [Screenshot: Driver Packs Location]

  6. Make sure that DriverPack MassStorage text mode is selected under the DriverPacks™ section:

    [Screenshot: Select MassStorage Pack]

  7. Hit Slipstream! This will extract the drivers into your BartPE plugin directory.
  8. Finally, run the BartPE builder tool, and make sure your plugins are initialized correctly. The #DriverPacks.net – BASE should be set to No and #DriverPacks.net – Mass Storage should be set to Yes, as pictured below:

    [Screenshot: BartPE Plugin Setup]

  9. Then build your BartPE CD.

This ought to give you a CD that can both access encrypted Pointsec drives, and deal with SATA.

To finish my story – it worked. I was able to create a BartPE disk with the Pointsec for PC plugin and then use it to recover the files from the encrypted drive. There is a little trick to it though – you can't just boot BartPE from the disk as usual. You have to let the machine load the Pointsec Pre-Boot Environment, log in as a user authorized to access the machine, and then hit Ctrl+F10. This will shoot you over to a customized Pointsec boot selection menu, where you can choose to start the system from the CD. That's the only way to do this. If you let the system boot from the CD before the Pointsec Pre-Boot Environment kicks in, the encrypted disk will remain inaccessible.

I hope this helps anyone who is trying to build BartPE with a crippled OEM disk. I know this works for Dell OEM, but chances are it might work for other versions too.

How to Make the Norton 360 Phishing Filter Go The Fuck Away

Dear internet, please remind me not to do free tech support anymore. Seriously! I mean it. Next time you see me agreeing to this bullshit, just smack me upside the head or something.

To make a long story short, an acquaintance asked me to solve their “bizzzzzzzzzare” computer issue the other day. And yes, I was told that the problem was so bizarre it needed several extra Z's for emphasis. I guess it was a trade-in – a bunch of Z's for the second R. Sounds like a pretty decent deal to me.

The problem at hand was succinctly summarized to me as “Gmail don't work”. Not the internet, not the browser – just Gmail, and only on that one computer, which happened to be an old, decrepit dell machine. I decided to investigate, so I had the person log into her Gmail for me to see what happens. The page loaded fine, albeit slowly – but I attributed that to the general slowness of the machine. So… I guess it's fixed? Can I go home now?

“Watch this!” she exclaimed while clicking on the first unread email. The mouse turned into an hourglass, and IE started loading something very slowly. Several seconds passed, and nothing changed. So my friend decided to click on it harder, and then even harder. When that didn't work, she changed strategy and clicked on it really fast – as if she was trying to communicate with the computer in Morse code. Eventually IE said “fuck it” and displayed the famous “(Not Responding)” message in the title bar.

“See? Every time I go into Gmail it crashes!” she said triumphantly, clearly pleased with this demonstration. Of course IE did not crash – it simply stopped responding because she had just clicked on it like a million times. You have to understand that when you have a computer that was made at the beginning of the Bronze Age, you can't expect instant response times. So I asked her to wait a bit, and about a minute and a half later IE popped back to life, displaying the email she requested. Still, the response time was slow, even for an old computer. What's more intriguing, other websites loaded much faster.

I installed Chrome on her system and ran a quick test. Gmail performance was blazing fast compared to IE. Unfortunately, my friend preferred that I fix IE instead of trying to force alternative browsers upon her. So I sat down and started digging. Pretty quickly I narrowed the performance problem down to the Phishing toolbar provided by her copy of Norton 360. When I disabled the Phishing protection and restarted IE, Gmail started working normally. When I enabled it, Gmail would take a whole minute to load an email.

Norton 360 is a piece of crap, but since she had just renewed the license for another year, uninstalling it was not a viable option. So I recommended simply keeping the Phishing protection disabled from now on, which would have worked if Symantec did not go out of their way to make their software super intrusive. As soon as I disabled the feature, Norton put a huge red X in the task bar. Then, every few minutes, it would display a popup message warning the user she was not protected, taunting her with a large, alluring “Fix Protection Now” button. I looked through all the available options, looked online, and even read the help files searching for a way to disable this message. Turns out there is no way to do it. You either run with the phishing filter on, or you deal with annoying nag messages popping up all the time.

I tried to explain this to my friend, but she did not listen to me. Lusers never do – it's their defense mechanism. You see, if computer-illiterate people were to accidentally learn something about computers, or technology in general, it could make them lose their street cred. So whenever one of us tries to explain something to them, they just switch off. Their eyes glaze over and they nod politely – but they are not really there. Their mind is adrift somewhere else. So the conversation went something like this.

Me: So, just ignore these messages for now, and don't click “Fix” when it complains about the Phishing filter
Her: Ok. Got it.
Norton 360: WARNING! WARNING! YOU ARE NOT PROTECTED! CLICK HERE TO FIX!
Her: *quickly clicks on the fix button*
Me: facepalm.jpg

It was fairly clear that this solution would not work. So I started to dig around. I opened IE, pulled up the Manage Add-ons panel and found the entry for the Norton Anti-Phishing toolbar (which was for some reason called “Norton Confidential” in the list). I noticed that it was linked to a file called CoIEPlg.dll. On her system it was located in:

C:\Program Files\Common Files\Symantec Shared\coShared\Browser\2.6\

So I decided to do something silly – I renamed that file and then re-enabled the phishing protection via the Norton 360 settings menu. Strangely enough, the nag message went away and the icon in the task bar turned green. I verified that Norton did not quietly restore the file while I was not looking, and I tested IE. Surprisingly, it launched without a hitch, simply ignoring the missing file completely. It also did not have any issue with Gmail this time around. I rebooted the machine a few times to make sure it wouldn't restore the DLL at boot time. It did not. Amazingly, this ugly hack worked… Which I guess just goes to show that Norton is a piece of shit product. If it was worth the money people spend on it, it would have prevented me from doing this. It would also have an option to disable the Phishing filter without incurring the wrath of the Annoying Nag Bar Monster.
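For reference, the whole “fix” boils down to one rename. Here is a minimal PowerShell sketch of it, assuming the same Norton version and install path as on her machine (check the Manage Add-ons entry on yours first), run from an elevated prompt:

# Disable the Norton toolbar by renaming its DLL. The path below is
# the one found on this particular machine - adjust as needed.
$dll = 'C:\Program Files\Common Files\Symantec Shared\coShared\Browser\2.6\CoIEPlg.dll'
if (Test-Path $dll) {
    # Keep a .bak copy so the hack is trivially reversible
    Rename-Item -Path $dll -NewName 'CoIEPlg.dll.bak'
}

To undo it, rename the .bak file back and restart IE.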

Oh, and don’t tell me I made her computer less secure this way. I did not. When I arrived, the toolbar was hidden in IE, which prevented any substantial Phishing warnings from being displayed. She was not using it at all – merely doing her best to ignore it. So yes, this fix theoretically broke Norton, but it fixed her Gmail. I also installed a full version of Malwarebytes on that box, which probably does more for her protection than the whole memory-hogging, bloated Norton 360 suite.

TLDR: I broke my friend's POS Norton 360 installation to fix her Gmail. Also, why the fuck am I still doing free tech support? Someone stop me next time.

The Strange Case of Missing Hard Drive Space
http://www.terminally-incoherent.com/blog/2009/12/14/the-strange-case-of-missing-hard-drive-space/
Mon, 14 Dec 2009 15:07:25 +0000

A few days ago, a user brought us a machine with some irrelevant software-related issue. It was a relatively easy fix, and we got the machine back to a usable state in no time, only to discover something strange. The machine had a 30GB drive and only a few MB of free space. This was causing all kinds of issues, as Windows didn't have enough space for its page file, and it kept complaining that it couldn't save System Recovery snapshots. Obviously we were not going to give it back to the user like that.

When someone showed this to me, I jokingly suggested that they locate and delete the guy's hidden pr0n folder. It's not like he would complain or anything, considering this is a company-issued laptop. This is where having a nice disk usage visualization built into the UI would really come in handy. Instead, we had to either do it manually (by looking at the size of each folder, roughly as in the sketch below), or use a third party tool such as WinDirStat.
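For example, a quick-and-dirty PowerShell sketch like this one produces per-folder totals for the top level of the drive. It assumes PowerShell 3.0 or newer (for the -Directory and -File switches), and it will be slow on a big drive, but it requires no install:

# Poor man's WinDirStat: total size of each top-level folder on C:,
# largest first. Folders we can't read are silently skipped.
Get-ChildItem -Path 'C:\' -Directory -Force -ErrorAction SilentlyContinue |
    ForEach-Object {
        $sum = (Get-ChildItem -Path $_.FullName -Recurse -File -Force -ErrorAction SilentlyContinue |
                Measure-Object -Property Length -Sum).Sum
        [PSCustomObject]@{ Folder = $_.FullName; GB = [math]::Round([int64]$sum / 1GB, 2) }
    } |
    Sort-Object -Property GB -Descending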

The results of the search were inconclusive. The Documents and Settings folder was merely 4GB. Program Files was around 3GB. Where was the space going? Into the Windows folder!

Yep, the Windows folder was over 20GB on a 30GB drive. How the hell does that happen? Windows is known for crufting up and growing over time, but not by that much. There is no way in hell the folder holding the system files should be that big. Unless, of course, the guy hid his pr0n folder in there. He didn't though. The reason for the missing space was even more bizarre.

The missing 20-something gigabytes was all contained in a single hidden system folder:

C:\WINDOWS\Installer

Inside there were over three thousand files, most of which had randomly generated names and nondescript extensions. Furthermore, a quick glance at the size column revealed that almost all of these files were identical in size.

What is the Installer folder used for? Apparently this is where Windows stores and unpacks the MSI files it downloads via the Microsoft Update service. Sometimes the temporary files are not cleared out when an installation fails for some reason.

In this case, Windows had been trying to install .NET Framework 1.1 Service Pack 1 roughly two or three times a day for about a year. Each time, the installation would silently fail and leave a randomly named temporary installation file in the aforementioned directory. No one ever noticed. Slowly but steadily this ate away 20GB of free space, in neat few-MB increments.
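You can spot this pattern from PowerShell instead of eyeballing the size column by hand. A rough sketch – the folder is hidden, so -Force is required, and you will want an elevated prompt:

# Group the Installer folder's files by size; a big cluster of
# identically-sized files is the tell-tale sign of a stuck update.
$files = Get-ChildItem -Path 'C:\WINDOWS\Installer' -Force |
    Where-Object { -not $_.PSIsContainer }
'{0:N1} GB total' -f (($files | Measure-Object -Property Length -Sum).Sum / 1GB)
$files | Group-Object -Property Length |
    Sort-Object -Property Count -Descending |
    Select-Object -First 5 -Property Count,
        @{ Name = 'SizeMB'; Expression = { [math]::Round([int64]$_.Name / 1MB, 1) } }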

Our solution was to delete all the dupes from the Installer folder, then remove all versions of .NET from the machine, install the newest Windows Installer version, re-install both .NET 1.1 and 3.5, and then run Windows Update to make sure all service packs got applied properly.
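If you are staring at the same problem, the deletion step might look something like this – note that 10266112 is a made-up example size, not a magic number; substitute whatever size dominates the grouping above, and keep -WhatIf on until you are sure:

# Preview the deletion of one suspicious size group. Drop -WhatIf
# only once the list it prints looks right.
Get-ChildItem -Path 'C:\WINDOWS\Installer' -Force |
    Where-Object { -not $_.PSIsContainer -and $_.Length -eq 10266112 } |
    Remove-Item -WhatIf

Fair warning: legitimate cached MSI files live in that folder too, and Windows uses them for repairs and uninstalls, so don't nuke it indiscriminately.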

Apparently this is not an isolated issue, and the problem can be caused by more than just .NET updates. So if your computer is low on disk space and you don't know where it all went, check the Installer folder. It is a system directory, so by default it will be hidden. You have to fiddle with the folder settings (show hidden files and protected operating system files) in order to view it.

Vista Disk Usage Bars
http://www.terminally-incoherent.com/blog/2009/12/03/vista-disk-usage-bars/
Thu, 03 Dec 2009 15:24:23 +0000

There is one feature I actually really do like in Vista, and I wish other environments implemented it as well. Guess what it is? Oh, right… The title of the post sort of gave it away, didn't it?

Yes, I really like these things:

[Image: Vista Disk Usage Stats]

I especially like how the bar turns red as it approaches the right edge. Visuals – graphs and charts – really do help us see the big picture. Normally, seeing 40+ GB of free space, I'd assume I have plenty of space left. I kept running into this issue in the past – I wouldn't start worrying until my free space dropped to a single-digit number of GB. Not that I didn't know I was running low – I just didn't care.

Seeing that you are slowly running out of drive space, however, is very different from merely knowing it. When I see that red bar under my C: drive, I feel a strong compulsion to delete stuff I don't need and uninstall software (mostly games) that I no longer use. Bare numbers do not evoke that feeling in me. Colorful bar graphs do.
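For the record, the bare numbers behind those bars are easy to get at – here is a little sketch pulling them from WMI (the DriveType=3 filter limits it to local fixed disks):

# The raw data behind the usage bars: size, free space and percent
# free for every local fixed disk.
Get-WmiObject -Class Win32_LogicalDisk -Filter 'DriveType=3' |
    Select-Object -Property DeviceID,
        @{ Name = 'SizeGB';  Expression = { [math]::Round($_.Size / 1GB, 1) } },
        @{ Name = 'FreeGB';  Expression = { [math]::Round($_.FreeSpace / 1GB, 1) } },
        @{ Name = 'PctFree'; Expression = { [math]::Round(100 * $_.FreeSpace / $_.Size, 1) } }

See? Accurate, and utterly uninspiring. Nothing about that table makes you want to go delete stuff.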

This makes me wonder why we don't use visualization for more file-system related stuff. For example, could we have a window manager that overlays a tiny pie chart in the corner of each file and folder, showing how much of the total drive space is taken up by that particular item? That would be extremely useful!

In fact, “How do I find out what is taking up so much space on my HD?” is the third most popular question I get asked by non-technical folk. Number one, of course, is “why is my computer slow” and number two is “why do I get that error message I didn't bother writing down, because I assume you can read my mind”. There is of course no easy answer to any of these. Most desktop environments do a pretty poor job of communicating disk usage info broken down by file/directory to the user. That's why there are hundreds of little disk analysis tools out there that attempt to help you with this, the simplest and most popular probably being xdiskusage on Linux. It gives you a very simple visual breakdown of what takes up the most space on your drive:

[Image: xdiskusage breaks it down for you]

Still, this tool will actually take several minutes to collect the info and calculate the percentages, because most file systems do not keep track of the combined size of the files in a directory. After all, directories are purely virtual constructs that exist only for our convenience. So drawing nice pie charts or graphs on folders would require us to traverse the folder tree, calculate the combined file sizes, and then cache the results for future reference. That usually means a performance hit every time the system needs to update the cache. Someone has already done this for Windows, but didn't go as far as adding visualization.
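To make the idea concrete, here is a toy sketch of the traverse-and-cache part – the function name and the hashtable cache are my own invention, not anyone's actual implementation, and it has no guard against reparse-point loops:

# Recursive folder sizes, memoized in a hashtable so repeated
# queries don't have to re-walk branches of the tree.
$script:SizeCache = @{}
function Get-CachedFolderSize([string]$Path) {
    if ($script:SizeCache.ContainsKey($Path)) { return $script:SizeCache[$Path] }
    $total = [int64]0
    foreach ($item in Get-ChildItem -LiteralPath $Path -Force -ErrorAction SilentlyContinue) {
        if ($item.PSIsContainer) { $total += Get-CachedFolderSize -Path $item.FullName }
        else                     { $total += $item.Length }
    }
    $script:SizeCache[$Path] = $total
    return $total
}

Computing the sizes is the easy half; the hard half is invalidating that cache when files change, which is where journaling comes in.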

With journaling file systems you could just use a background service. It wouldn't have to monitor all drive activity – only check whether the journaled changes affect folder sizes that have already been cached, and recalculate as needed.

Looking into the future, relational file systems (if they ever catch on) could do even better than this, making the calculation just a matter of a single optimized query.

This is something that actually can be done – we can implement this. There is nothing we can do about machines slowing down due to user-loaded malware. There is not much we can do to prevent stupidity-driven error messages. But we could make it easy for regular users to visualize their disk usage and identify problem areas.
