Cool (or mundane) computer trick impresses co-worker

I managed to impress someone at the office this week with a cool (read mundane) computer trick.

I got a call from the secretary, who sits a few seconds’ walk from my desk, asking for a scanned version of my hand-written signature. I replied that I had one on my computer at home and could easily get it within a few minutes; she said it would be faster for her to just walk over with a piece of paper for me to sign, which she would then scan and play around with.

And this is where I began to impress her: by the time she got to my desk with said sheet of paper, I had already VNC’d into my home server’s desktop and was in the process of doing the same from the server to my main computer’s desktop (I still have to finish giving my desktop a static IP and setting things up so that I don’t have to go through my home server 🙂). I finished logging into my desktop, looked in the likely directory, and voilà! I fired up my home email client, and within a couple of minutes, she’d received my scanned signature.

Beyond the fact that the Gnome desktop supports VNC out of the box — and the fact that I installed TigerVNC instead of using the standard Gnome Remote Desktop Viewer — it’s too bad that I can’t really claim this as a cool Linux trick, since my computer at work runs Windows, and you can set up Windows boxes to “pick up the phone” too ….

She was still impressed, though. And it took about as much time as the whole process of signing a piece of paper, scanning it, cropping it, etc.

Canola oil instead of petroleum oil car treatments and ethanol blends

I was impressed the other day when I finally got around to rustproofing my car at Antirouille Métropolitain, a chain of rustproofing businesses in Quebec. My car is 13 or 14 years old and has virtually no rust, although I have to repaint the running board on the driver side yet again; I let things go too long over the past few months, so the rust is starting up, but it’s not bad at all. Yet.

They asked me, “Do you want the traditional oil-based treatment or the ‘bio’ treatment? It’s dripless and made of canola oil.”

Apparently the selling point with most people was that it’s dripless, vs. their traditional oil treatment, for which the optimum formula is necessarily drippy. For me the selling point was that it’s canola oil, and the dripless part was just a secondary bonus. This doesn’t affect their usual performance guarantees.

After I’d paid, and while the technician was prepping my car and even starting the treatment, I asked the man behind the counter, “Aren’t you going to tell your technician to use the canola oil treatment?” To my surprise, he replied that their default policy is to treat cars with the canola oil unless the customer expressly asks for the traditional oil treatment, in which case he would then inform the technician to use “the old treatment”.

The story goes that it took three years to develop the product so that its effects would be equivalent to their traditional oil treatment, and they spent a further two years doing road tests before widespread commercialization, which began in early 2009. Apparently, the canola oil treatment is the overwhelming choice at this location, as well as business-wide to varying degrees — no doubt due to some clever marketing and a highly refined counter-level sales pitch that had me sold hook, line and sinker — to the point that they sell perhaps one or two traditional oil treatments per week, if that. As mentioned earlier, the principal selling point is that it’s dripless. In urban centres such as Montreal and Quebec City, this is a big deal because people don’t like having oil drip marks in their driveways and on their garage floors. In somewhat less urban centres such as Sherbrooke, the adoption rate of the canola oil treatment is down to 40% to 60%, apparently because the market, having a larger rural clientele, is less likely to have asphalt driveways or concrete garage floors that would be stained by the dripping oil, and/or seems slower to change old habits — such as the “old” mentality (and old sales pitch) that being drippy is a necessary side-effect of the formulation having its maximum effect.

So I was quite impressed that the market is slowly shifting away from some “old fashioned” treatments. Now let’s hope that the rest of the formulation doesn’t outweigh the benefits of replacing the petroleum components.

Note that for the past few months I’ve also been making a point of buying gas from Sonic, since they seem to be the only mainstream chain of gas stations in Quebec, or at least in the Montreal area, that sells ethanol blends (6%-10%); they also sell biodiesel blends. Sometimes I go really out of my way or plan routes to pass near a Sonic, but usually not by much, since there happens to be a Sonic minutes away from home. The other Sonic I occasionally frequent is near Drummondville, when I happen to be driving that way, and there is another along the way west towards the end of the island. Apparently there are a few other gas stations — I presume independents — that also sell ethanol blends in my area, although I have yet to locate them.

This part about the gas has been quite the reverse culture shock from Ottawa, where it’s the unusual case (or was about 12 years ago when I worked there) for a gas station either not to sell ethanol blends or at least not to be within a couple of blocks of one that does; it’s taken me over 12 years to finally get back to making a point of using them.

Now if only the ethanol blends were more available, and the blends higher; however, a quick check on Wikipedia suggests that most cars with standard gasoline engines can only tolerate up to about 10% ethanol without some kind of adjustment.

PDF’s, Scanning, and File Sizes

I’ve been playing around with PDF’s for the past few weeks and have noticed a very interesting thing: a PDF is *not* a PDF is *not* a PDF is *not* a PDF, ad nauseam and, it would seem, ad infinitum. Part of me almost wonders if the only distinguishing feature of a PDF is the .pdf extension at the end of the file. In “researching” this post I have learned what I knew already: PDF boils down to being simply a container format.

Lately I have been scanning some annual reports from years past for an organization I belong to, and due to the way xsane 0.997, which comes with Fedora 12, scans pages — and I will concede straight out of the gate that I have only explored it enough to get it to do what I want and to learn how it does things “its way” — the PDF file sizes are “fairly” large.

In finding this out, I first ran into one of the quirks of xsane 0.997: something about the settings doesn’t have it stop between pages for me to change pages; at least, I haven’t gotten around to finding the setting in xsane to have it pause between pages. This is important because my scanner doesn’t have an automatic page feeder. The first page of results of a Google search turns up several comments about this problem, but no solution, and at first glance the second page of results is of no help either.

So I end up scanning pages one at a time, and then use GhostScript to join them all up at the end to make a single PDF.
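For the record, the joining step is just one Ghostscript invocation; here’s a minimal sketch of how it can be driven from Python (the filenames are hypothetical placeholders, and this is simply my way of doing it, not the only one):

```python
# Join individually scanned single-page PDFs into one document by shelling
# out to Ghostscript's pdfwrite device. Filenames here are placeholders.
import subprocess

def join_pdfs(output, pages):
    """Concatenate the given PDF files into `output` via Ghostscript."""
    subprocess.run(
        ["gs", "-dBATCH", "-dNOPAUSE", "-q",
         "-sDEVICE=pdfwrite",
         f"-sOutputFile={output}",
         *pages],
        check=True)

# Example (hypothetical filenames):
# join_pdfs("report.pdf", ["page01.pdf", "page02.pdf", "page03.pdf"])
```

Because pdfwrite rewrites every page through one output device, this is also where the re-encoding (and hence the size savings described below) happens.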

Without having added up the file sizes, it was obvious that the total size of all the scanned pages, at 75 dpi and in black and white, was noticeably larger than the single PDF with all the pages joined. This did not bother me since, again without having added things up, the difference didn’t seem *too* great, and I assumed that the savings were principally due to administrative redundancies being eliminated by having one “container” as opposed to 25 to 30 “containers”, one for each individual page.

Then this week a curious thing occurred: I scanned a six-page magazine article, and then separately a two-page magazine article, at 100 dpi and in colour, and whaddya know, the combined PDF of each set is smaller than any of the original source files. Significantly so. In fact, the largest page from the set of six is double the size of the final integrated PDF, and in the case of the set of two, each of the original pages is triple the size of the combined PDF. I’m blown away.

Discussing this with someone who knows the insides of computers way more than I do, I learned something: it would appear that xsane creates PDF’s using the TIFF format (for image quality), as opposed to what I imagine Ghostscript does when joining files, which would seem to be whatever it can to reduce file sizes — in this case, I imagine, converting the TIFF’s inside the PDF’s into JPEG’s. A bit of googling indeed appears to associate TIFF’s and PDF’s when it comes to xsane; indeed, a check of the “multipage” settings shows three output file formats — PDF, PostScript and TIFF. And looking in Preferences/Setup/Filetype under the TIFF Zip Compression Rate, it’s set at 6 out of 9.

So I googled PDF sizing, and one result led me to an explanation of the difference between the “Save” and “Save As …” options when editing a PDF: “Save” will typically append metadata on top of metadata (including *not* replacing the stale metadata in the “same” fields!); “Save As”, well, that’s what you really want to use to avoid a bloated file, since everything that should be replaced will be.
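A hedged aside: one way to get the effect of a clean “Save As” from the command line is to rewrite the file through Ghostscript’s pdfwrite device, which emits a fresh file containing only the objects the document actually references (filenames here are hypothetical):

```python
# Rewrite a PDF through Ghostscript's pdfwrite device; the output contains
# only the objects the document actually references, shedding the dead
# metadata that repeated "Save" operations can leave behind.
import subprocess

def resave_pdf(src, dst):
    """Produce a cleanly rewritten copy of `src` at `dst`."""
    subprocess.run(
        ["gs", "-dBATCH", "-dNOPAUSE", "-q",
         "-sDEVICE=pdfwrite",
         f"-sOutputFile={dst}",
         src],
        check=True)

# Example (hypothetical filenames):
# resave_pdf("bloated.pdf", "slim.pdf")
```

This is a sketch, not a guarantee — how much it saves depends entirely on how much dead weight the original file carries.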

Another result begins describing (what is no doubt but a taste of) the various possible settings in a PDF file, and how, using a given PDF editing application, you can go through a PDF, remove some settings, correct others, etc., and reduce the size of PDF’s by essentially eliminating redundant or situationally irrelevant information — such as fields with null values — whose presence would have the effect of bloating the file unnecessarily.

I’ve known for a few years that PDF’s are a funny beast by nature when it comes to size. For me the best example by far used to be the use of “non-standard fonts” in the source file — oh, say, any open-source font that isn’t in the standard list of “don’t bother embedding the font since we all know that nine out of ten computers on the planet have it”. In and of itself this isn’t a problem; why not allow for file size savings when it is reasonable to presume that many text PDF’s are based on a known set of fonts, and most people have said fonts installed already on their system? However, when one uses a non-standard font, or is using that tenth computer, and constantly creates four- to six-page PDF text documents ten times the size of the source documents, frustration sets in. Having wondered whether one could designate a font substitution along the lines of “use a Roman font such as Times New Roman” when such a font is used — in my case, Liberation Serif or occasionally Nimbus Roman No9 L — I asked my “person in the know”. Apparently, Fedora 12’s default GhostScript install, whose settings I have not modified, does just that.

I guess what really gets me about this is how complicated the PDF standard must be, and how wildly variable the implementations are — at least, given that Adobe licences PDF creation for free provided that the implementations respect the complete standard — or, more to the point, how wildly variable the assumptions and settings are in all sorts of software when creating a PDF. I bet that were I to take the same source and change one thing, such as the equipment or software, the results would be wildly different.

So, concurrent to the above scanning project, I happened to experiment with a portable scanner — a fun challenge in and of itself to make it work, but it did without “too much fuss”. And I found out something interesting, which I knew had nothing to do with PDF’s but (I presume) rather with scanners, drivers, and xsane. I tried scanning some pages of one of said annual reports with the portable scanner on an identical Fedora 12 setup using xsane, and the PDF’s that were produced were far greater in size than those from my desktop flatbed scanner. My flatbed scanner would scan the text and the page immediately surrounding it, but correctly identified the “blank” part of the page as blank and did not render those areas, thereby significantly reducing the scanned image size. The portable scanner did no such thing and created images from the whole page — blank spaces rendered, in this case, to a dull grey and all — thereby creating significantly larger PDF files than the scans of the same pages from my flatbed. However, as I mentioned, I assume that this is a function of the individual scanners and their drivers, and possibly of how xsane interacts with them, and in my mind is not a function per se of how xsane creates PDF files.

Another interesting lesson.

AT&T does it again! (AKA Will I Ever Learn?)

So I just turned on my TV and here’s a commercial … family dinner … It’s Mom’s tablecloth … Back in the day my grandmother made this for me, they don’t make them like they used to anymore … pass the spaghetti … OOOPS! — NO, WAIT! Don’t do anything!

And they all naturally go to the net to look for a solution (“peroxide and something else,” everyone in internet cafés and schools around the world yells at their computer screens). And what does the computer screen look like?

It bears a vague resemblance to the Gnome desktop under Ubuntu, with the white toolbars on top and bottom and hints of brown here and there, but it’s just a touch too blurry to identify as anything other than NOT Windows — probably MovieOS.

I guess that every time they shoot a commercial, the geeky “I use Linux at home, I’d love to have the bragging rights to *that* computer in the TV commercial” IT guy in the back has the day off, or they don’t want to give Gnome or KDE a financial nod. Yet they want to go to the trouble of avoiding an MS or Apple desktop. Interesting.

(sigh …)

19 months, 16 *successful* installs

I just did a tally of all the installs I’ve done on my personal systems since the end of June, 2008, when I bought a new-to-me desktop and took advantage of the opportunity to upgrade from the CentOS 4.x series to the CentOS 5.x series. 🙂 And I was a bit blown over; unfortunately not surprised, but blown over nonetheless.

Over 5 systems, I’ve done 16 successful installs. Then there were a few dud installs that had to be restarted right away, although a couple of said duds were counted because the installs were actually useful for a few weeks. One install during the most recent cycle was not counted as successful despite the fact that it was: unfortunately, the boot sector on the drive died. (It was to be expected — back last June or July, Palimpsest identified the drive as having a fatal error on it and declared it as having about 6 months to live, and whaddya know!) So I had to get another “new” drive, which I happened to have handy, and do another install.

To be fair, there has been one factory-sealed new system thrown in there (what fun to wipe the Windows install — which, curiously and apparently irreparably, froze up after all the updates were done — the whole point being to be able to say “yeah, but Windows worked on it!”, which it didn’t!), another system that has just about never been used since and after a few months has now been removed from the upgrade cycle, another system that finally died, or at least on whose ghost I have given up, and finally a replacement system for said “death in the family”.

One of the reasons why I always say “I’d love to go back to CentOS if it weren’t so hopelessly obsolete” is that it’s stable and has a long life to it (typically something like 7 years), and hence with CentOS you don’t have to upgrade every six months like with Fedora — oops, that’s every 12 months or so, given the support cycle (wink wink). 🙂 Fedora *has* been good to me, mind you, since I started using it at version 9.

The problem is that when you have several systems, you’re still doing installs every 6 months or so if the systems aren’t in sync with each other. Further, one of the consequences of using second- or third-hand computers, buying new computers, upgrading parts and hard drives, and even trying out another distro at least once, is that your systems are hardly ever likely to be in sync for the whole 13-month-or-so lifespan of a new-version-released-every-6-months distro like Fedora. And of course, someone who would like to avoid having to do new installs every 6 months is going to upgrade a system that is out of sync to bring it in line with the others, in the hopes that “this will be the cycle when I get to enjoy the whole lifespan and not have to upgrade 6 months from now”.

Hence the ideal of trying to avoid the “install every 6 months habit” by syncing the installs with each other whenever a single new install is done is fallacious when you have at least two systems — in fact, you end up doing the opposite, since not only are you installing (or re-installing) at least once every six months for one legitimate reason or another, but you end up doing multiple installs, many of which are unnecessary in and of themselves, every 6 months, just to keep everything in sync. And hence, the “install every 6 months habit”.

Of course, I have often been enjoying the process despite myself; in fact, I’ve managed to put together an ever-increasingly long list of steps to take from start to finish when installing a system (which I’ll be presenting to one of the local LUGs in a few weeks). Fortunately, my computers are purely home desktops or hobby servers without any critical processes on them, and my brother at least humours my habit by doing those little bits that are still beyond my ever-increasing sysadmin skill set (which of course is growing with each install cycle). And in the process I’m gaining a practical appreciation for what I’ve known all along since I started using Linux in 2006 and started with CentOS: “The likes of Fedora and Ubuntu may be great, but you have to re-install every 6 months! Who wants to do that?!?!” (Apparently, I do. 🙂 )

It must be interesting having multiple production servers with multiple versions of a given distro, let alone more than one distro (i.e. a mix of CentOS, Debian, SuSE, and for some good fun, Slackware). Good thing that having “the latest and greatest” usually isn’t particularly important on a server, so that it can actually have a useful life. It must be hard for the likes of Red Hat, for instance, which must add new drivers all the time, but in order to keep from breaking compatibility and adding “bad change” into the distro, other things don’t happen — things like the HPLIP version that is one incremental subversion (or whatever the 0.x increments are called) behind the minimum requirements for my 2.5-year-old printer, which has since gone through several such incremental upgrades and at least a whole version upgrade.

News Flash — Linux spotted in the wilds of Montreal!

This morning I did something very unusual for me: I took the commuter train into work instead of driving my car, and I saw a Gnome desktop on someone’s laptop computer! Doing a double take, I checked, and whaddya know — it’s definitely a Gnome desktop, it’s very familiar, it isn’t brown, and yup, it was Fedora 12.

A few weeks ago my brother had posted on SlashDot asking if anyone had seen Linux in use in the wild — not in data centres, of course, nor at LUG meetings or other such gatherings of Linux types where Linux is of course expected to be seen, but random, innocent spottings in places like restaurants, cafés, university or college student halls, on the streets, on the train, etc. The responses were an underwhelming (or disappointingly overwhelming) “no”. In fact, my brother said that I was the only person he knew who used a Linux desktop besides himself, and that I’m far more pure about it than he is. (In fact, he uses Windows as regularly as Linux on his personal systems, while I “only” use Windows at work, and don’t particularly care for it.) Besides seeing Linux desktops at LUGs and Linux Meetups, offhand I can only think of two people I know who say they use Linux at home as their desktop.

I started chatting with this person, who apparently develops software for a particular industry (no, not that industry) to be used on Red Hat 5.x servers; they use Fedora because CentOS is hopelessly out of date for things like wireless support on their computer. Unfortunately, however, they have been finding Fedora 12 unstable … not my experience so far.

Suffice it to say that, even putting the Fedora part aside, this chance meeting made my day!

Fedora 12 installed — I’m a linux addict with an install every 6 months habit

Well, over the past couple of weeks I’ve installed Fedora 12 on three systems — mainly because I got a great great great new P4 3.0GHz home server, which I have been considering using as my desktop while using my current desktop, a P4 2.8GHz, as the server.

To my dismay, I have done this 6 months after I made a point of having the same version of Fedora on all my computers so as to avoid the “reformat a system every six months” treadmill I was on by having different versions — because, well, my old server died, and of course there was no point in putting a 6-month-old version of Fedora on the new one, which I would only *have* to change 6 months from now anyway … Sigh, the bliss of using CentOS; were it only not so completely obsolete, I would love to use it again … On the other hand, Fedora is the crack cocaine of “latest and greatest”, so for the moment there’s no going back!

All of this started back in, what, September, if not before; I couldn’t get the 80 gig drive and the 500 gig to play nice together, or so I thought. There *was* an issue with different spin speeds, but wait folks, there’s more. When I *did* have the 500 gig as the boot disk, something seemed off with the amount of available storage, although I wasn’t fully aware of it. When I finally brought the 500 back as the boot drive, the installation went well several times with Fedora, then with CentOS 5.1 (which would have been promptly updated upon reboot). Except the first reboot wouldn’t work: the system would freeze, and the keyboard buffer would fill up real quick. Forums were of little help, with sufficient dead ends and apparent red herrings. Finally, I figured out on my own that the BIOS was way too old to recognize such a large drive, and flashing it with a “new” BIOS would have required a lot of fun with WINE, which I wasn’t really wanting to get into using a live CD.

Christmas and a new server came along, and I’m up and running with a desktop upgraded as follows: June 2008, CentOS 5.1 to 5.2; July 2008, some version of Ubuntu; December 2008, Fedora 10; July 2009, Fedora 11; and now January 2010, Fedora 12 … with a netbook, a laptop which is no longer used, an old server, and a new server each following a similar route for much of the way. So much for even taking advantage of Fedora’s “1 month after the release of the second version following” support cycle … I’m still upgrading every 6 months!

As a result, though, I have finally refined my “to-do” list for installing a machine so that it’s not so much of a hassle; in fact, two of the three setups were not only a breeze in and of themselves, but the to-do lists made the rest of the work a breeze too. Of course, my brother told me two years ago that his list was 300+ steps long, and that a two-year-old version of it he’d found was only about 120+ items long. My list is currently somewhere around 58 items long, depending on how you count it … I wonder how long it’ll take to get to 300? 🙂

However, I had problems with the desktop right after it was installed like a breeze: the disk boot sector died (I had expected it would anyway, as of about 6 months ago), and funnily enough the memory stick on which the setup had worked like a breeze before suddenly wasn’t cooperating. I’ve gotta figure out what was going wrong with UNetbootin creating the memory stick images from the ISO; curiously, the boot image required after the disk formatting in Anaconda wasn’t being properly copied, or at least activated, on the stick.

Anyway, I think I have to work on getting the most out of the system; I bet that months from now Fedora will find a way to make me upgrade again, with that lucky number associated with it and all … 🙂

FTP clients and Windows

This week I had what I consider to be curious experiences with others regarding ftp.

I’m what I would call old school; I started using ftp about 15 years ago, when the internet was barely, if at all, entering the greater public consciousness, and the piece of software I had access to, gFTP, wasn’t chargeware. Way back when, Windows didn’t even have its own integrated network stack, let alone effectively integrated ftp abilities in Windows Explorer.

Then Windows and the Vice President came along and claimed to have invented the internet :), and people began using drag & drop the way you can between two directories in a GUI. I have never done it this way. Ever. Even the first time I asked IT to set up an ftp site for me for a work project, I naturally downloaded and installed FileZilla after immediately finding their instructions on how to do it through Windows Explorer confusing. So the Windows way is completely foreign and confusing to me.

The first curious experience was when I, probably naively, sent out an email to a bunch of co-workers who have access to a corporate ftp site for downloading files resulting from field work, such as photos. I said that if people were having difficulty transferring files the “windows” way, they could use FileZilla. The response from the project manager was “Why can’t we do it directly?” Simple response: “Apologies for the confusion; just in case the way IT described it doesn’t work, here’s another way to do the transfers directly.” Again, I became aware that I was probably naive in presuming that the “windows” way may or may not work outside the corporate IT network — why wouldn’t it? I just checked using the Windows machine down the hall on my home network, and of course I was wrong and it works.

But I found the reaction curious anyway; maybe I was taking it too literally, but the “why can’t we do it directly?” part struck me, since I was proposing an alternate DIRECT way of doing things.

The second curious experience was more concrete. A co-worker was beside me, and we were discussing the contents of the FTP site. I fire up FileZilla, and they ask me what I’m doing. I carefully respond; they say they’ve never seen it done that way, and it looks confusing. My turn. I explain, “I’m starting up an ftp client. Here are the files on my computer on the left side; here are the files on the ftp site on the right.” “Oh really? That must be a new way.” (Harumppphhhh.) “Uhm, I’ve been using ftp clients that look this way for the past 15 years.”
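For what it’s worth, that two-pane client workflow is a thin layer over a handful of protocol commands; here’s a minimal sketch with Python’s stdlib ftplib (the host, credentials, and filenames are hypothetical placeholders):

```python
# A bare-bones "download one file" transfer, the kind of thing a two-pane
# client like FileZilla or gFTP does under the hood. All connection details
# below are hypothetical placeholders.
from ftplib import FTP

def fetch(host, user, password, remote_name, local_name):
    """Log in, download `remote_name` in binary mode, save as `local_name`."""
    with FTP(host) as ftp:
        ftp.login(user, password)
        with open(local_name, "wb") as f:
            ftp.retrbinary(f"RETR {remote_name}", f.write)

# Example (won't run as-is; the server is made up):
# fetch("ftp.example.com", "jdoe", "secret", "photos/site01.jpg", "site01.jpg")
```

Same protocol underneath, whether the front end is Windows Explorer, FileZilla, or six lines of script.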

I didn’t continue into a tirade about how Windows has people brainwashed into thinking that the way to do anything on a computer is the “Windows Way”.

Sheesh, I don’t know how to fix my car when it breaks down. Doesn’t mean I don’t have an idea about what it looks like under the hood.

Bill Kurtis yet again

In the latest Bill Kurtis AT&T 3G commercial (something to the effect of “3G Anywhere”, where he’s sitting in front of a green screen and all sorts of backgrounds are flashed, such as a kitchen, the park, the fairgrounds, etc.), it again seems to me that there is a white line across the top of a blue screen that is similar to a Gnome desktop.

I have of course come to accept that it’s just a fake, static imitation screenshot put in during post-production editing; and anyway, in this case there isn’t a taskbar at the bottom of the screen to further make me hop in glee, yelling “Finally! He’s really using a Linux distro this time, with Gnome!”

Now what’s really getting me is that they’ve obviously put a bit of time and trouble into making up unique fake MovieOS screenshots for each of the various commercials. Sheesh, for all the trouble AND expense they’re going to, I have to wonder why they don’t just send that money to Canonical or the Fedora Foundation, or maybe the Gnome people, as a quid pro quo “thank you” contribution to avoid any licencing issues or whatever, when they could ask one of the techies in back: “Hey, you use Linux, right? Well, we can’t / don’t want to / find it too expensive to get a licencing deal with either MS or Apple to have one of their computers appearing in a TV commercial, and we’re gonna make a LOT of commercials with different screenshots — can we borrow your laptop with that free OS where we won’t have to pay any licencing?”

Again, I think that the guy or girl would be scampering over as fast as possible just to get the bragging rights to “Hey, you know those Bill Kurtis AT&T commercials? He was showing *my* computer! Cool, eh?”

XFCE

Well, last week I was in a bit of an experimentation mood. Now that I have a new laptop, my old laptop, a PIII 450 with 320 megs of RAM, can be used for experimentation.

In the meantime, it has slowed down and over the past year video has been anywhere from just fine to sketchy, depending on the complexity of the video.

So I removed the Gnome desktop, figuring that its memory footprint had probably been at the cusp of affecting other processes. Curiously, this didn’t cause the computer to explode, and it seemed as though, as long as I didn’t turn off the computer, I would have been able to continue indefinitely with Gnome.

And of course, I turned off the computer.

This was not a good thing: the boot-up sequence appeared to freeze. Fortunately, single-user mode was available — hey, they changed the key sequence in Fedora 11 from control-alt-F1 to control-alt-F3, and even the exit sequence changed from control-alt-F7 to control-alt-F1! However, getting my wireless to work was not obvious, and I learned just how much of wireless seems to be handled by the desktop. So connecting to the repos to install XFCE was not possible.

I wondered about this for about a week until my brother suggested, “Well, just plug in your network card and use a wired connection.” Duh — this works. 🙂 “yum groupinstall XFCE” works, and things are humming. (In between, he had suggested “don’t worry about getting the wireless up until the desktop is installed” … until he suggested the wired connection, this was like telling someone stranded with an empty gas tank on the side of a desert road to drive to a gas station to get more gas!)

The installation went like a breeze, and while there are differences, the whole experience seems very similar to the Gnome desktop experience.

My brother was impressed that it seemed so easy and seamless; he said that a few years ago this easiness factor would have been unheard of.

A quick test of video while I was doing a yum update — an unfair test on this machine in this day and age — showed a slight amount of choppiness. I expect that things will go well when I test again and nothing else is going on.