Google Maps seems to need to learn that some streets go East AND West

I think that Google Maps is overlooking a basic function: In the real world, people sometimes go east, and sometimes go west.

Yesterday for the third time in a couple of years I relied upon Google Maps for directions and was sent to the wrong place. Caveat Emptor strikes again.

In Montreal, the east-west streets crossing St. Laurent Boulevard (which, no surprise, runs roughly north-south) start their numbering in both directions from there. Hence you can have two equally valid addresses on a given street, with the proviso that one is designated “East” and the other “West”. (Hey! It’s Captain Obvious!)

Fortunately, the address I was looking for was 151. After an hour of circling the neighbourhood looking for parking around “151 Laurier” (East, as proposed by Google Maps), I found that the address wasn’t a dépanneur selling a huge variety of microbrewery beers, and by all appearances never had been, so I decided to head further down the street looking for similar businesses. Then I had a V-8 moment: “Oops, what about 151 Laurier WEST?” I high-tailed it in the opposite direction and found the business in question. And to my disappointment, they were out of the particular beer I was seeking: Weizenbock, by La Brasserie Les Trois Mousquetaires, which has replaced my previous definition of ambrosia, Trois Pistoles by Unibroue.

Twice before I have had similar experiences:

About a year ago, on a business trip in completely unfamiliar territory in Western Canada, I had looked up a client’s address. Not knowing about any local east/west split in addresses along the Trans-Canada Highway in that locality, I tried to find the address Google Maps had provided, on the east end of town; by the time I finally managed to suspect that my client’s address was a “West” address and got there, I was about 45 minutes late.

And just to quash anyone in the Peanut Gallery out there about to say “Aha, well, when using Google Maps you should know that in such cases they’ll always send you to the East address, so be sure to always check both!”: a couple of years ago I had looked up a local address for a client, and Google sent me to Gouin Boulevard West here in Montreal, a solid 45-minute drive away from my client’s Gouin Boulevard East address.

Now the Peanut Gallery may have a point: In the real world, people sometimes go east, and sometimes go west. And when it comes to using a free online service, you get what you paid for. As such, when looking up an address on any online service, one should think: “Hmmm, this is an east-west street which may be bisected by such-and-such a street, and as such may have East addresses and West addresses; I should specify both east and west in my address search.”

But I wonder how many other people place enough faith in Google that under such circumstances, such as when they don’t know that a given street has an East and a West, they would reasonably expect that when a street has valid East addresses and valid West addresses (and likewise for North and South addresses), Google’s response page would come back with “Did you mean (A) 151 Laurier East, or did you mean (B) 151 Laurier West?” Certainly Google seems good enough at asking such a question when you slightly misspell a street or city name, or when it doesn’t recognize the address you supply and provides you with half a dozen options, often as spread across the country as across the city.

Cool (or mundane) computer trick impresses co-worker

I managed to impress someone at the office this week with a cool (read mundane) computer trick.

I got a call from the secretary, who sits a few seconds’ walk from my desk, asking for a scanned version of my hand-written signature. I replied that I have one on my computer at home and could easily get it within a few minutes; she replied that it would be faster for her to just walk over with a piece of paper for me to sign, which she would then scan and play around with.

And this is where I began to impress her: By the time she got to my desk with said sheet of paper, I had already VNC’d into my home server’s desktop and was in the process of doing the same from the server to my main computer’s desktop (gotta finish giving it a static IP and setting it up so that I don’t have to go through my home server 🙂 ). I finished logging into my desktop, looked in the likely directory, and voilà! I fired up my home email client, and within a couple of minutes she’d received my scanned signature.

Beyond the fact that the Gnome desktop comes set up out of the box to do VNC — and the fact that I installed TigerVNC instead of using the standard Gnome Remote Desktop Viewer — too bad I can’t really claim this as a cool Linux trick, since my computer at work is Windows, and you can set up Windows boxes to “pick up the phone” too ….

She was still impressed, though. And it took about as much time as the whole process of signing a piece of paper, scanning it, cropping it, etc.

PDFs, Scanning, and File Sizes

I’ve been playing around with PDFs for the past few weeks and have noticed a very interesting thing: A PDF is *not* a PDF is *not* a PDF is *not* a PDF, ad nauseam and, it would seem, ad infinitum. Part of me almost wonders if the only distinguishing feature of a PDF is the .pdf extension at the end of the file. In “researching” this post I have learned what I knew already: PDF boils down to being simply a container format.

Lately I have been scanning some annual reports from years past for an organization I belong to, and due to the way the xsane 0.997 that comes with Fedora 12 scans pages (which, I will concede straight out of the gate, I have only explored enough to get it to do what I want and to learn how it does things “its way”), the PDF file sizes are “fairly” large.

Along the way I ran into one of the quirks in xsane 0.997: something about its settings doesn’t have it stop between pages for me to change pages; at least, I haven’t yet found the setting that would make it pause between pages. This is important because my scanner doesn’t have an automatic page feeder. The first page of results of a Google search turns up several comments about this problem, but no solution; at first glance the second page of results is no help either.

So I end up scanning pages one at a time, and then use GhostScript to join them all up at the end to make a single PDF.
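A minimal sketch of that join step, assuming Ghostscript is installed; the filenames here are made up for illustration, and the blank input pages merely stand in for what the scanner would actually produce:

```shell
# Skip gracefully if Ghostscript isn't installed on this machine.
command -v gs >/dev/null 2>&1 || exit 0

# Stand-ins for individually scanned pages (in practice: output from xsane).
gs -dBATCH -dNOPAUSE -q -sDEVICE=pdfwrite -sOutputFile=page-01.pdf -c showpage
gs -dBATCH -dNOPAUSE -q -sDEVICE=pdfwrite -sOutputFile=page-02.pdf -c showpage

# The join itself: the pdfwrite device concatenates every input file, in the
# order given, into a single output PDF. Zero-padded names keep a shell glob
# like page-*.pdf in page order.
gs -dBATCH -dNOPAUSE -q -sDEVICE=pdfwrite \
   -sOutputFile=report-combined.pdf page-01.pdf page-02.pdf
```

The same one-liner scales to a full report; only the input list changes.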

Without having added up the file sizes, it was obvious that the total size of all the scanned pages, at 75 dpi and in black and white, was noticeably larger than the single PDF with all the pages joined. This did not bother me since, again without having added things up, the difference didn’t seem *too* great, and I assumed the savings were principally due to administrative redundancies being eliminated by having one “container” as opposed to 25 to 30 “containers”, one for each individual page.

Then this week a curious thing occurred: I scanned a six-page magazine article, and then separately a two-page magazine article, at 100 dpi and in colour, and whaddya know, the combined PDF of each set is smaller than any of the original source files. Significantly so. In fact, the largest page from the first set of six is double the size of the final integrated PDF, and in the second set of two, each of the original pages is triple the size of the combined PDF. I’m blown away.

Discussing this with someone who knows the insides of computers far better than I do, I learned something: it would appear that xsane creates PDFs using the TIFF format internally (for image quality), whereas Ghostscript, when joining files, seems to do what it can to reduce file sizes, which in this case I imagine means converting the TIFFs inside the PDFs into JPEGs. A bit of googling does indeed associate TIFFs with PDFs when it comes to xsane; a check of the “multipage” settings shows three output file formats: PDF, PostScript and TIFF. And looking in Preferences/Setup/Filetype, the TIFF Zip Compression Rate is set at 6 out of 9.
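For what it’s worth, Ghostscript can also be told explicitly how aggressively to recompress the images inside a PDF, via its stock -dPDFSETTINGS presets, which downsample images to a target resolution. A sketch under the same assumptions as before (Ghostscript installed, made-up filenames, a blank page standing in for a real scan):

```shell
# Skip gracefully if Ghostscript isn't installed on this machine.
command -v gs >/dev/null 2>&1 || exit 0

# Stand-in for a large scanned input file (in practice: a PDF from xsane).
gs -dBATCH -dNOPAUSE -q -sDEVICE=pdfwrite -sOutputFile=big-scan.pdf -c showpage

# Re-run the file through pdfwrite with a preset; from smallest to largest:
#   /screen (72 dpi images), /ebook (150 dpi), /printer and /prepress (300 dpi).
gs -dBATCH -dNOPAUSE -q -sDEVICE=pdfwrite -dPDFSETTINGS=/ebook \
   -sOutputFile=smaller-scan.pdf big-scan.pdf
```

On a real scan, picking /screen or /ebook is usually where the dramatic size drops come from.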

So I googled PDF sizing, and one result led me to an explanation of the difference between the “Save” and “Save As …” options when editing a PDF: “Save” will typically append metadata on top of metadata (including *not* replacing the stale metadata in the “same” fields!); “Save As” is what you really want to do to avoid a bloated file, since everything that should be replaced actually is.

Another result describes (no doubt just a taste of) the various possible settings in a PDF file, and how, using a given PDF editing application, you can go through a PDF, remove some settings, correct others, and so on, reducing its size by eliminating redundant or situationally irrelevant information (such as fields with null values) whose presence would otherwise bloat the file unnecessarily.

I’ve known for a few years that PDFs are a funny beast by nature when it comes to size. For me the best example by far used to be the use of “non-standard fonts” in the source file: say, any open-source font that isn’t on the standard list of “don’t bother embedding this font since we all know that nine out of ten computers on the planet have it”. In and of itself this isn’t a problem; why not allow for file size savings when it is reasonable to presume that many text PDFs are based on a known set of fonts, which most people already have installed on their systems? However, when one uses a non-standard font, or is using the tenth computer, and constantly creates four- to six-page PDF text documents ten times the size of the source documents, frustration sets in. Having wondered whether a font substitution could be designated along the lines of “use a Roman font such as Times New Roman” when such a font is used (in my case, Liberation Serif or occasionally Nimbus Roman No9 L), I asked my “person in the know”. Apparently, Fedora 12’s default GhostScript install, whose settings I have not modified, does just that.

I guess what really gets me about this is how complicated the PDF standard must be, and how wildly variable the implementations are (at least, given that Adobe licences PDF creation for free provided that implementations respect the complete standard), or more to the point, how wildly variable the assumptions and settings are in all sorts of software when creating a PDF. I bet that were I to take the same source and change just one thing, such as the equipment or the software, the results would be wildly different.

So, concurrent to the above scanning project, I happened to experiment with a portable scanner; making it work was a fun challenge in and of itself, but it did work without “too much fuss”. And I found out something interesting which, I presume, has nothing to do with PDFs but rather with scanners, drivers, and xsane. I tried scanning some pages of one of the said annual reports with the portable scanner, on an identical Fedora 12 setup using xsane, and the PDFs produced were far larger than those scanned with my desktop flatbed scanner. My flatbed scanner would scan the text and the area immediately surrounding it, but correctly identified the “blank” part of the page as blank and did not scan those areas, thereby significantly reducing the scanned image size. The portable scanner did no such thing: it created images of the whole page, blank spaces rendered (in this case to a dull grey) and all, thereby creating significantly larger PDF files from the same pages. As I said, though, I assume this is a function of the individual scanners and their drivers, and possibly of how xsane interacts with them, and not per se of how xsane creates PDF files.

Another interesting lesson.

Fedora 12 installed — I’m a linux addict with an install-every-six-months habit

Well, over the past couple of weeks I’ve installed Fedora 12 on three systems — mainly because I got a great great great new P4 3.0GHz home server, which I have been considering using as my desktop while my current desktop, a P4 2.8GHz, becomes the server.

To my dismay, I have done this six months after I made a point of having the same version of Fedora on all my computers, precisely to get off the “reformat a system every six months” treadmill that I was on from having different versions. But, well, my old server died, and of course there was no point in putting on a six-month-old version of Fedora which I would only *have* to replace six months from now anyway … Sigh, the bliss of using CentOS; were it only not so completely obsolete, I would love to use it again. On the other side, though, Fedora is the crack cocaine of “latest and greatest”, so for the moment there’s no going back!

All of this started back in, what, September, if not before; I couldn’t get the 80 gig drive and the 500 gig drive to play nice together, or so I thought. There *was* an issue with different spin speeds, but wait, folks, there’s more. When I *did* have the 500 gig as the boot disk, something seemed off with the amount of available storage, although I wasn’t fully aware of it. When I finally brought the 500 back as the boot drive, the installation went well several times with Fedora, then with CentOS 5.1 (which would have been promptly updated upon reboot). Except the first reboot wouldn’t work: the system would freeze, and the keyboard buffer would fill up real quick. Forums were of little help, with plenty of dead ends and apparent red herrings. Finally I figured out on my own that the BIOS was way too old to recognize such a large drive, and flashing it with a “new” BIOS would have required a lot of fun with WINE, which I wasn’t really wanting to get into using a live CD.

Christmas and a new server came along, and I’m up and running with a desktop upgraded from June 2008 — CentOS 5.1 to 5.2; July 2008, some version of Ubuntu; December 2008, Fedora 10; July 2009, Fedora 11; and now January 2010, Fedora 12 … plus a netbook, a laptop which is no longer used, an old server, and a new server, each following a similar route for much of the way. So much for even taking advantage of Fedora’s “one month after the release of the second version following” … I’m still upgrading every six months!

As a result, though, I have finally refined my “to-do” list for installing a machine so that it’s not so much of a hassle; in fact, two of the three setups were not only a breeze in and of themselves, but the to-do lists made the rest of the work a breeze too. Of course, my brother told me two years ago that his list was 300+ steps long, and that he’d found a two-year-old version of it that was only about 120 items long. My list is currently somewhere around 58 items long, depending on how you count it … I wonder how long it’ll take to get to 300? 🙂

However, I had problems with the desktop right after its breeze of an install: the disk’s boot sector died (I had expected it would, as of about six months ago), and funnily enough the memory stick from which the setup had worked like a breeze before suddenly wasn’t cooperating. Gotta figure out what was going wrong with UNetbootin writing the stick images from the ISO; curiously, the boot image required after the disk formatting in Anaconda wasn’t being properly copied to, or at least activated on, the stick.

Anyway, I think I have to work on getting the most out of the system; I bet that months from now Fedora will find a way to make me upgrade again, with that lucky number associated with it and all … 🙂

FTP clients and Windows

This week I had what I consider to be curious experiences with others regarding ftp.

I’m what I would call old school; I started using ftp about 15 years ago when the internet was barely, if at all, entering the greater public consciousness, and the piece of software I had access to, gFTP, wasn’t chargeware. Way back when, Windows didn’t even have its own integrated network stack, let alone effectively having integrated ftp abilities in Windows Explorer.

Then Windows and the Vice President came along and claimed to have invented the internet :), and now people drag & drop the way you can between two directories in a GUI. I have never done it this way. Ever. Even the first time I asked IT to set up an ftp site for me for a work project, I naturally downloaded and installed FileZilla after immediately finding their instructions on how to do it through Windows Explorer confusing. So the Windows way is completely foreign and confusing to me.

The first curious experience came when I, probably naively, sent out an email to a bunch of co-workers who have access to a corporate ftp site for downloading files resulting from field work, such as photos. I said that if people were having difficulty transferring files using the “Windows” way, they could use FileZilla. The response from the project manager was “Why can’t we do it directly?” Simple response: “Apologies for the confusion; just in case the way IT described it doesn’t work, here’s another way to do the transfers directly.” Again, I was probably naive in presuming that the “Windows” way might not work from outside the corporate IT network — why wouldn’t it? I just checked using the Windows machine down the hall on my home network, and of course I was wrong: it works.

But I found the reaction curious anyway; maybe I was taking it too literally, but I found the “why can’t we do it directly?” part curious since I was proposing an alternate DIRECT way of doing things.

The second curious experience was more concrete. A co-worker was beside me, and we’re discussing the contents of the FTP site. I fire up FileZilla, and they ask me what I’m doing. I carefully respond; they say they’ve never seen it done that way, and it looks confusing. My turn. I explain “I’m starting up an ftp client. Here are the files on my computer on the left side, here are the files on the ftp site on the right.” “oh really? That must be a new way.” (harumppphhhh.) “Uhm, I’ve been using ftp clients that look this way for the past 15 years.”

I didn’t continue into a tirade about how Windows has people brainwashed into thinking that the way to do anything on a computer is the “Windows Way”.

Sheesh, I don’t know how to fix my car when it breaks down. Doesn’t mean I don’t have an idea about what it looks like under the hood.

ASP and Windows-centric web pages slow

I am at another hotel on business. (Ho-hum, they have a password, I don’t care at this point to find out how long it’s been in place, I’m sure it’s good odds that it’s been a while. Shall we say that it’s named after a good ship. I suppose I could be wrong.)

Surfing is, often enough, slow. A good number of pages hang and time out. At first, I don’t notice much because the main one I visit is always slow, always hangs and occasionally times out.

At first I was wondering if it’s because I’m in a hotel using their internet — you know, bottlenecks due to lots of people using the internet hookup at the same time (what, at 6AM?), people setting up repeaters in the bushes stealing signal because there is an insecure password that at worst would cost them a night’s stay to figure out, etc.

I also have the company laptop with me, to do company work (of course, I have my own laptop for personal stuff; the company policy on personal use of its computers is getting to be much like those closed-source licences that make you wonder whether you may use the software at all, even for its apparently intended purpose). It runs Windows on a Centrino Core 2 at something like 2.8GHz or more. For the fun of it, I type in le web page du jour. It loads quite speedily, while the same page is still hanging on my Fedora 11 Acer Aspire One.

And I notice something interesting: le web page du jour is in ASP. So is the historically slow site. Last night, the site that was impossible to properly log into from my laptop — the work email server, such that I logged into my own web email to send the message to the home office — is, you got it, on a Windows server. A fourth site this morning timed out in the middle of a survey I had agreed to take; I hope it’s a linux server, since it’s for a magazine I subscribe to on a little topic called linux (the publishing house also has a PC magazine, so go figure).

I’m wondering: Am I having difficulties with these pages because I’m using Firefox? Linux? A slower machine? Is it Fedora’s implementation of Firefox 3.5 Beta 4 or whatever? Some combination of the above? Is this an ASP compatibility problem? Or an ASP discrimination problem? Or are the pages in question themselves biased against non-windows computers? Or non-IE browsers? (Never heard that one before!) (here’s my archive)

À suivre …

The new Google OS

Well for those of you who haven’t heard, internet darling Google announced in the past day or two that it will be releasing a new OS expected in 2010 (here’s my archive).

I had a few reactions:

– Google getting headline news should make it interesting, and they have the money and clout to be a real competitor. I heard about the new Google OS watching the morning news, where one of the taglines was “Google to launch operating system”. Sorry, Ubuntu only gets headline news among gearheads like me (see below), and it’s a footnote at best when people talk about that South African Space Tourist.
– What will “it” be? Linux? Google-Hurd? Open-source? GPL? BSD-licence? Apache Licence?
– I wondered what it would be about. Goobuntu? Ahhh, it’ll be an internet-centric linux distro — meaning, even though it’s obvious that it’s meant to be a MS-killer on the netbooks (with the possibility that it could be released, with appropriate changes, for the desktop too), its main comparison will be gOS. (Insert tongue firmly in cheek here. Then bite.)
– It’ll only work if it A) deals with the problems Fedora has out of the box (flash, mp3, avi, DVD, etc.) by, no doubt, including such support out of the box, and is generally AS GOOD POINT FOR POINT as MS, and then some; and B) does something better than MS — and is something that people want.
– It’ll likely have to change the computing paradigm. The cloud computing paradigm has been touted for seven years or more now and has only been taking off in the past year or so. Google has been slowly eroding MS with things like gmail and google docs, alongside Firefox and OpenOffice.org, and generally contributing to open source and other projects, but I wonder when the breaking point will come when suddenly EVERYBODY drops MS and goes somewhere else, or rather when the pie becomes properly split up such that what’s under the hood matters less than what goes on on the screen. Oh, and people don’t like change. Resistance to change is one of Open Source’s, no, scratch that, “any alternative to MS”’s, biggest enemies.
– My original take on the above was that Google *would* be the people able to push things beyond the breakpoint.
– I’m wondering if it will have to go on par with MS by pulling a Novell to integrate MS-files nicely.
– Ahh, “machines with it will be sold starting in 2010” — it was but also wasn’t as specific as that … will there be a slice in it for Google? Or will it be given away the way other distros are, but with insidious settings that encourage users, without their realizing it, toward some web page with Google click ads? Or … what’s in it for them?

Then, of course, I’m listening to one of these “play to the lowest common denominator, then add two points of intelligence” syndicated talk radio hosts, who has a guest talking about this subject. To set the stage: the previous topic he discussed was a YouTube videoclip of a person using both hands to shave his head while driving, and whether there should be a law against such a thing, which he capped off with the likes of “there should be an anti-moronification law against such morons.”

To be fair, the stance he and his guest take is targeted at most people who inexplicably (to me, anyway) have no clue that there *is* an (easy) alternative to MS on the PC, besides the Mac, which he rightfully puts in a class of its own. And, Linux *is* mentioned as an available alternative, but “it’s pretty much for the gearheads”.

Here’s what I sent him, I was so riled up:

*****

Forget Ontario hair-shaving idiots making the roads less safe, I wonder about those on the radio who say linux is for “gearheads”.

I suppose I’m a gearhead, I do indeed like computers for their own sake beyond the day to day usefulness they present.

However, I’ve been using various versions of linux for the past several years on my PCs and take great pleasure in overwriting any existing MS format on any new computer I get — over the past three years, that’s about five computers, some formatted a few times over. Some are older and more archaic than the netbooks your piece mentioned, let alone today’s top-of-the-line desktops, and I’ve been using them for desktop uses, not server applications. On them are full OSes that are not stripped down — unless, of course, I chose one of the minimalist versions — and, interestingly, they are not all that slow.

There are several versions geared toward the “average” user. Most of the more common versions can do all of the day-to-day things mentioned in your piece and are on par with, sometimes superior to, MS. I use a version that is a cross between the “gearhead” market and day-to-day usage. To newcomers I recommend Ubuntu, which I do not use myself. Virtually any user of MS, however, would be able to use Ubuntu, available at ubuntu.com, with no difficulty; it is the most popular of the linux versions and is not aimed at the “gearheads”.

I was incredulous, listening to the show, to hear that people still think MS is the only option for their PCs. I suppose the few who have heard of linux figure that something given away for free is worth the money paid for it. Au contraire: MS is less configurable and, as you know, virus-prone compared to linux, and for the virus part you have to pay more to get properly protected. Linux, on the other hand, is safer, faster, and free.

I found your guest informative, but I found the bias toward linux not being a competitive desktop alternative to Windows — which it has been for years — “very interesting”.

*****

Oh, I do think that the driver in Ontario is a complete moron. 🙂

And Mr. Shuttleworth, please note that I *will* recommend Ubuntu to the general public since the learning curve is easier than even Fedora’s.

Well Hallelujah! Big Brother has finally acted!

I just checked into a motel today that not only has a password for its internet access, and not only a good, secure password, but — get this — one that was automatically generated when I checked in and asked about internet access; everyone gets a unique password. They told me it has an expiry time that I could choose when I first logged on; if I mistakenly chose one too short, I would just have to call the front desk for a new one. I know it’s unique to “me” because I’m using two laptops on this trip, my company machine for company business and my personal laptop for personal stuff, and when I tried the original password on my personal machine after having used it on my company machine, I was told that the limit of accounts attached to the password had been reached; so I called the front desk and they gave me another one, no questions asked.

Some would say that the unique passwords could be a violation of my privacy. Possibly, if they happened to tie the password to my room; I don’t know whether they have, but in this case … well, the conspiracy theorist in me has not been awakened (I know, famous last words 🙂 ). I suppose I’m not fond of the notion that the unique password could be used against me, including wrongly and/or maliciously.

However, I suspect that I’m safe; hotels are generally in the business of being discreet as part of their profit motive, so keeping track of who gets which password, or of derivative information about my internet use, is not generally in their interest. Then again, perhaps I should also be worried about the key card the hotel gives me, and whether they are keeping track of how often I go in and out of my room, or use the pool, hot tub, exercise room or laundry room protected by the key cards. Oops, they took down my name, address, and phone number, and they linked my credit card information to me when I signed in! My privacy has already been thrown out the window!

I shouldn’t be — I figure this should be the norm — but I’m sincerely impressed that this hotel protects its business resources much the same way it protects the others, such as the aforementioned rooms, pools, hot tubs, exercise rooms, laundry rooms, and the like.

Hotel internet access passwords — Here’s a case for Captain Obvious

In the past four weeks I have spent as much time in my own bed at home as I normally spend in hotels for business over 6 to 12 months (about four nights). As such I have been using a few hotel internet hookups.

First, the good news from Captain Obvious: most hotels and motels in North America these days seem to have wireless internet. I know, I’ve been using hotel and motel wireless for about three years now, but it’s so commonplace now that my experience last fall near Boston, at an otherwise charming New England inn which used a large group of computer-savvy geocachers as guinea pigs for its new wireless internet setup, seems odd.

Next, to set the stage, Captain Obvious observes that several years ago, when people started getting wireless routers in their homes and offices and anywhere else, they eventually learned to lock down the router with a password known only to those with whom they wanted to share their bandwidth (and only those). Otherwise, you know, the neighbours decide to save a few bucks on a cable modem and piggyback on yours. Or they use so much of it that you start getting overage charges, assuming their theft of your signal doesn’t significantly affect your own use. Or strange-looking people park in front of your house for hours on end for no apparent reason. Or, worse yet, they do so while doing things that would have the cops knocking down your door, like, oh, who knows, downloading kiddie porn, or spamming, or hacking into financial institutions and stealing large sums of money.

Which makes me wonder about all the hotels I’ve been staying at over the past month:

– The first one had a great, really secure password, which hadn’t been changed in over a year and a half.

– The next one used the hotel’s name with a few numbers added to the end. I don’t expect that they change it very often, if at all; I’ve used that hotel a couple of times over the past three weeks and I haven’t been told of any password change; the signal just works fine. I think it’s even money that it still won’t have changed when I likely go back to that hotel at the end of the week.

– The next one didn’t even have a password.

– The next one used its fax number as its password. Apparently the owner just recently acquired the establishment and has at least gotten to the point of pulling the plug on the router for a few minutes whenever he notices a suspicious character in his parking lot. There were anywhere from two to four other insecure networks in range; one called itself “free public wireless”, and another was a nearby internet café.

– The next one had two routers without a password, and there was another insecure signal in range.

At each place I have implored the people at the front desk to please install a password, change it at least monthly if not weekly, and not use a dictionary word; better yet, they could generate a unique key for each guest, or at least for whoever checks in during a given hour (OK, that’s getting a bit too much Big Brother, but the day may come). Of course, two of the five are staffed by employees without any real influence over such matters, and at a third I suspect much the same.

In general I wonder how hard it would really be: their router has a control panel where they can change the password, and they could remind themselves to do so on a schedule marked on the calendar, or ask their IT guy to set up a script to do it automatically on whatever schedule the innkeeper chooses, with a quick reference note showing the current “internet password” on the guest registration screen. I know I couldn’t set that up myself, but even I could figure out how to do a cron job; surely in the MS world it’s easier than that.
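The cron-job idea could be as small as the sketch below: generate a random, non-dictionary passphrase and hand it off. How the passphrase actually gets pushed to the router is entirely router-specific, so the `apply_to_router` function here is a hypothetical stub, not a real vendor API; the passphrase generation itself uses only the Python standard library.

```python
# Sketch of a scheduled Wi-Fi password rotation, meant to be run from cron.
# Assumption: pushing the passphrase to the router is done by some
# router-specific mechanism (web-UI automation, SSH, or a vendor API);
# apply_to_router() below is a hypothetical placeholder for that step.
import secrets
import string

def new_passphrase(length: int = 12) -> str:
    """Generate a random letters-and-digits WPA passphrase."""
    alphabet = string.ascii_letters + string.digits
    return "".join(secrets.choice(alphabet) for _ in range(length))

def apply_to_router(passphrase: str) -> None:
    # Hypothetical stub: replace with whatever your router's admin
    # interface actually accepts. Here we just print it so the front
    # desk can copy it onto the guest registration screen.
    print(f"New internet password: {passphrase}")

if __name__ == "__main__":
    apply_to_router(new_passphrase())

# Example crontab entry to rotate at midnight on the 1st of each month:
# 0 0 1 * * /usr/bin/python3 /opt/rotate_wifi.py
```

The innkeeper picks the schedule simply by editing the crontab line; weekly rotation would be `0 0 * * 1` for every Monday at midnight.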

I congratulated the first place on their excellent password, but pointed out that all someone has to do is stay a night, or speak with someone who has, and they have the password. Then they could easily set up a router in the bushes nearby, add a few repeaters or run a wire to their house a few doors down, and bingo: who cares about the cops banging down the innkeeper’s door? The cops *know* that hotel guests will often use their trips to hotels to download things they might never download at home. Meanwhile, the innkeeper is paying for some thief to degrade the service he’s supplying to his customers just to remain competitive. All he needs is the bad publicity of the cops knocking down his door, or those of his clients, because someone in a van with a laptop, ready to drive away the moment sirens are heard down the street, is conducting some illicit business over a hijacked connection.

I can put up with the nuisance of a first-time password challenge web page at a hotel. I understand that Friday evenings to Sunday mornings — and possibly other times during the week — there are a lot of guests at a hotel and the speed is likely to be slower as a result. But I wonder how much slower it is because some industrious person may be hidden somewhere in the bushes or the parking lot, or have a series of repeaters running down the block, slowing down the access I’m paying for through my room rate. Or because I temporarily lose a connection while the innkeeper “scares off” a suspicious character in the parking lot by unplugging his router for a few minutes.

And the notion that my door conceivably *could* be knocked down for someone in the parking lot, or at least, I could be a spectator to such a thing and still be questioned, is less than savoury.

My complaints *are* rather petty compared to world hunger. But that’s not the point: it seems to me a good combination of due diligence and, well, good business sense, just to change the password on the first of the month, or every Monday. They put key-card locks on the pool, the whirlpool, the exercise room, and even the laundry room (which is coin-operated!) to limit access to their guests.

Why not do the same for their wireless internet?

Sigh … Bill Kurtis *is* consistent

In my last post I mentioned that Bill Kurtis, in his “I’m Bill Kurtis, and I’m faster than …” commercial series, didn’t appear to be consistent with the laptops the techies were supplying him with.

Obviously, I have seen the original commercial several times since and have been paying attention.

Well, apparently he has been consistent; I again demonstrated my general incompetence, in this case by expecting to see brown (ughh!) in the form of the GNOME desktop. In fact, the screen was much closer to the greenish blue of the second “Andy Roddick” commercial. And there is the disclaimer at the bottom of the screen: “Simulated images”.

This bugs me a lot. I’ve said it before and I’ll say it again: it seems that for something as simple as these commercials, the screenshot doesn’t matter; all they are trying to do is avoid having some software developer demand royalties for the screenshot, so they make a generic desktop screenshot in-house. Too bad: the techies in the back would usually love to see their laptop in a (inter)nationally broadcast commercial and have the bragging rights.

Of course, putting aside my conspiracy theory about the TV producers not wanting to pay royalties for the on-screen use of copyrighted software, or the legal department not wanting to risk depicting a “real, live OS” (like it or not, that’s what MS is) and making potential customers think that what is being advertised will *actually* work with their computer (hey, there’s a new conspiracy theory for me!), there is probably a much more logical explanation, along the lines of the refresh-rate mismatch between a computer screen and a TV camera (flickering), or simply that it’s difficult to get a good, readable screen image off a computer being shot on TV.