I’m going virtual, and hitting a real brick wall

This week, after reading the June 2010 issue of Linux Format, I decided to do what the cover article was about: “Try any Distro!” Luckily enough, I already use the base distro they recommend, Fedora, “because it has the best implementation of Virt-Manager”.

Certainly the setup of Virt-Manager was easy (yum install kvm virt-manager libvirt), and the next part was just as easy: I downloaded the PC-BSD net-install ISO. PC-BSD because I’ve been wanting to try BSD for a while, and because the same issue of Linux Format happened to review it.

After that things went well: I followed the setup (10 GB virtual hard drive; “1” processor of two, since my machine only has one, but hyperthreading is up and running, so the system identifies two; 512 MB of RAM; etc.) and went through the easy installer. The whole thing took about three hours to download all the packages and do the setup. Bedtime came around just as it was ready to reboot. Darn, I had to go to bed on an error message: “No /boot/kernel/kernel”.
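For reference, the same guest definition can be sketched from the command line with virt-install (part of the python-virtinst package that rides along with virt-manager). The VM name and ISO path below are hypothetical placeholders, and the command is printed rather than executed here, since actually running it needs libvirt and root privileges:

```shell
# Sketch only: a virt-install invocation matching the setup above
# (10 GB disk, 1 vCPU, 512 MB RAM). Name and ISO path are hypothetical.
VM_NAME="pcbsd-test"
ISO="$HOME/isos/pcbsd-netinstall.iso"

CMD="virt-install --name $VM_NAME \
  --ram 512 --vcpus 1 \
  --disk path=/var/lib/libvirt/images/$VM_NAME.img,size=10 \
  --cdrom $ISO"

# Print rather than run it, since it requires libvirt and root:
echo "$CMD"
```

Virt-Manager builds essentially the same definition through the GUI; the point is just that the settings above map more or less one-to-one onto flags.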

So the next evening, I decided to try OpenSolaris and OpenSuSE, the latter via the net-install option. Things fared worse: OpenSolaris says that there’s “No bootable device”. Huh? Isn’t it supposed to boot off the ISO so that I can either go through the install process or see the live CD? OpenSuSE gives me the same result.

I’ll have to look into this. The second two experiences make me wonder whether the PC-BSD problem was a coincidence; my guess at the moment is that the PC-BSD case points to a problem reading the virtual hard drive, while in the second and third cases the problem is simply getting the machines to boot the ISOs properly.
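One thing I’ll be checking, for what it’s worth: whether the guest definitions actually list the CD-ROM in their boot order. In libvirt’s domain XML (shown by `virsh dumpxml <vm>`, editable with `virsh edit <vm>`), the boot order lives in the `<os>` element; a simplified fragment like this (the arch and machine values will vary by setup) tells the guest to try the CD before the hard disk:

```xml
<os>
  <type arch='x86_64' machine='pc'>hvm</type>
  <!-- try the virtual CD-ROM first, then fall back to the hard disk -->
  <boot dev='cdrom'/>
  <boot dev='hd'/>
</os>
```

If `<boot dev='cdrom'/>` is missing or listed after `<boot dev='hd'/>`, a guest with an empty disk would plausibly report exactly this kind of “no bootable device” error.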

More on the CNBC schedule

In my last post, I compared CNBC’s actual programming over a long weekend against what was announced on the electronic guide supplied by Shaw’s satellite service in Canada. I found enough time slots whose broadcast programme did not correspond to the announced schedule to conclude: “CNBC, announce what you’ll be broadcasting, and broadcast what you announce.”

For my own edification, to see whether what I’d found was a fluke, to gauge just how flagrant it was, and to get a larger sample size, I decided to do some more in-depth “research” and gather a whole lot more data, which I present here in PDF format.

Basically, the conclusion I came to from this larger data set was: “Outside business hours in North America’s Eastern Time Zone, Monday to Friday, CNBC’s announced schedule isn’t particularly reliable.” (Count the number of instances that don’t correspond.) That’s the polite, reserved conclusion, and it’s tempered by several things: I really don’t know what CNBC thinks about schedule accuracy (I did send the contents of my last blog post to CNBC, and was told it was passed on to the programming department); often enough, while it still doesn’t trump the announced schedule, the shows actually broadcast were much more relevant to CNBC’s apparent mandate of broadcasting business news; during said business hours I’m out earning a living and don’t watch TV anyway; and I have a nagging feeling that something is afoot.

/side note on:

My suspicion that something may be afoot comes from the following:

In Canada, the Canadian Radio-television and Telecommunications Commission has rules about “Canadian Content” to protect “Canadian Culture”. (Since for the moment I’m not interested in tackling that issue, I won’t. 🙂 ) One of the consequences of this is that on cable, satellite, and the like, when a Canadian channel is showing an American show at the same time as an American channel that can be viewed by the same person on the same TV — let’s say the Super Bowl is being broadcast, obviously on an American channel, and a Canadian channel carries it — the cable company must substitute the Canadian feed for the American feed on the American channel. So, for instance, during the Super Bowl, while we get to watch the same game live, we don’t get to see the American commercials, not even on the American channels; these days at least we can go to YouTube the following day to see them.

As a result, in such a case, at the beginning of the broadcast on the American channel we sometimes see a flicker when the feed is switched from the American feed to the Canadian one. Over the past couple of weeks, one thing I occasionally (though not always) noticed, and didn’t document, was this same flicker at the beginning of some of the broadcasts that weren’t as announced. It makes me wonder whether it’s a fluke, or whether the feed is being switched for one reason or another, regardless of who’s doing it and whether or not the CRTC is involved.

/side note off

Another conclusion was a confirmation of my original one: the switches (with one exception) were usually not of the type where breaking news or some other obvious reason would trump the announced schedule, even though what was broadcast sometimes seemed more relevant to CNBC’s mandate than what was announced; think of broadcasting Squawk Box, a live business news programme, instead of one of the announced aforementioned excellent business documentaries. I also found it interesting that over weekends there were a number of half-hour slots announced as either “Paid Programming” or named paid programming along the lines of “Get Sexier in 90 Days”, “Insane Sexy Bodies”, or “Relieve Back Pain”, while very respectable CNBC documentaries or international financial news programmes were actually broadcast. Normally two such half-hour programs would be announced (usually “Paid Programming”, then a named infomercial) while a one-hour program would be broadcast, be it a one-time documentary or an episode of “American Greed”, a combination investigative-journalism / documentary program; such shows would often be directly announced at other times. It was obvious to me that sponsors don’t call up CNBC asking to buy a block of time only to pay for one of CNBC’s own shows.

So enjoy the data. Of course I’m also sending it off to CNBC.

Ubuntu and Fedora LiveCDs — Ubuntu a clear winner!

I’m trying to convince a certain group to wipe their virus-infected computer (no doubt laden with trojan horses, key loggers, and spyware too) and move over to Linux, so I’ve burned the Fedora 12 Live CD and the Ubuntu 9.10 Live CD.

I don’t want to bother giving them the Fedora Live CD: the Ubuntu CD is far slicker, and the Fedora Live CD is far too vanilla. And that’s despite my usual rivalry with Ubuntu. At first glance, the killer is the inclusion of OpenOffice.org on the Ubuntu CD, while Fedora has the lightweight (albeit otherwise capable) AbiWord. Even the brown looks bright and welcoming, as opposed to Fedora’s more conservative, dull greyish-blue.

Add to that the directory of various files introducing Ubuntu and what it’s about, even including a sample mortgage calculator, and it’s little wonder that Ubuntu gets a whole lot of first-timers straight out of the gate, or that first-timers settle on Ubuntu after trying a bunch of other distros. As a marketing tool (at least for the desktop), the Ubuntu CD wins hands down; I’m not even sure that Fedora, fully set up via traditional means from the DVD or the full set of CDs, is this flashy.

I’ve been telling people for a while that “I use Fedora, but you’ll find Ubuntu easier”. I’ve just seen the proof. Seeing the CD, I would want to start afresh with it. I won’t of course, but I was impressed.

I’m wondering, though, which is the real killer: the inclusion of OpenOffice.org, or the directory introducing Ubuntu? I bet that were Fedora to mount a similar directory, including how to expand upon the base supplied on the CD, people might take it up a bit more. I’m thinking of things like “Accustomed to OpenOffice.org? Go here and this is what you do,” or a “top five things to do once you install the Fedora base (or even just the Live CD)” based on “common desktop tasks”, “setting up a home file and media server”, or the usual choices found in the standard anaconda setup.

I’m even thinking that the Ubuntu Live CD is productive, and “complete”, right away with its little directory, never mind little tutorials.

I guess that I should find out about whether or not Fedora does something like this, though … 🙂

PDFs, Scanning, and File Sizes

I’ve been playing around with PDFs for the past few weeks and have noticed a very interesting thing: a PDF is *not* a PDF is *not* a PDF is *not* a PDF, ad nauseam and, it would seem, ad infinitum. Part of me almost wonders if the only distinguishing feature of a PDF is the .pdf extension at the end of the file name. In “researching” this post I have confirmed what I knew already: PDF boils down to being simply a container format.

Lately I have been scanning some annual reports from years past for an organization I belong to, and due to the way the xsane 0.997 that comes with Fedora 12 scans pages — which, I will concede straight out of the gate, I have only explored enough to get it to do what I want and to learn how it does things “its way” — the PDF file sizes are “fairly” large.

Along the way I ran into one of the quirks of xsane 0.997: something about the settings doesn’t have it stop between pages for me to change pages; at least, I haven’t found the setting that would make it pause between pages. This is important because my scanner doesn’t have an automatic page feeder. The first page of results of a Google search turns up several comments about this problem, but no solution; at first glance the second page of results is of no help either.

So I end up scanning pages one at a time, and then use Ghostscript to join them all up at the end into a single PDF.
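For the curious, the joining step is a one-liner. The file names here are hypothetical placeholders, but the flags are the standard Ghostscript pdfwrite invocation; I assemble and print the command rather than run it here:

```shell
# Sketch: join single-page PDFs into one via Ghostscript's pdfwrite device.
# File names are hypothetical placeholders.
OUT="annual-report.pdf"
PAGES="page-01.pdf page-02.pdf page-03.pdf"

CMD="gs -dBATCH -dNOPAUSE -q -sDEVICE=pdfwrite -sOutputFile=$OUT $PAGES"

# Printed rather than executed, since it needs Ghostscript and the files:
echo "$CMD"
```

Page order in the output is simply the order of the input arguments.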

Without having added up the file sizes, it was obvious that the total size of all the pages scanned at 75 dpi in black and white was noticeably larger than the single PDF with all the pages joined. This did not bother me since, again without having added things up, the difference didn’t seem *too* great, and I assumed the savings were principally due to administrative redundancies being eliminated by having one “container” instead of 25 to 30 “containers”, one per page.

Then this week a curious thing occurred: I scanned a six-page magazine article, and then separately a two-page magazine article, at 100 dpi and in colour, and whaddya know, the combined PDF of each set is smaller than any of the original source files. Significantly so. In fact, the largest page from the set of six is double the size of the final integrated PDF, and in the case of the set of two, each of the original pages is triple the size of the combined PDF. I’m blown away.

Discussing this with someone who knows the insides of computers way better than I do, I learned something: it would appear that xsane creates PDFs using TIFF-style image data (for image quality), whereas Ghostscript, when joining files, seems to do what it can to reduce file sizes, which in this case I imagine means converting the TIFFs inside the PDFs into JPEGs. A bit of googling does indeed associate TIFFs with PDFs where xsane is concerned; a check of the “multipage” settings shows three output file formats: PDF, PostScript and TIFF. And looking in Preferences/Setup/Filetype, the TIFF Zip Compression Rate is set at 6 out of 9.
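There is a way to test this theory, assuming poppler-utils (or the older xpdf-utils) is installed: pdfimages extracts the images stored inside a PDF, and with -j it writes out images that are internally JPEG-encoded as .jpg files, everything else as .ppm/.pbm, so the resulting file extensions reveal the internal encoding. The PDF name here is a hypothetical placeholder, and the command is printed rather than run:

```shell
# Sketch: inspect how the images inside a PDF are actually stored.
# 'example.pdf' is a hypothetical placeholder.
PDF="example.pdf"

# -j writes JPEG-encoded images out as .jpg; other encodings become .ppm/.pbm.
# 'file' then confirms what each extracted image is.
CMD="pdfimages -j $PDF extracted && file extracted*"

# Printed rather than executed, since it needs poppler-utils and the file:
echo "$CMD"
```

Running this on one of the xsane originals and then on the Ghostscript-joined PDF should show directly whether the join step really converts the images.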

So I googled PDF sizing, and one result led me to an explanation of the difference between the “Save” and “Save As …” options when editing a PDF: “Save” will typically append metadata on top of metadata (including *not* replacing the expired metadata in the “same” fields!), while “Save As” is what you really want to use to avoid a bloated file, since everything that should be replaced will be.

Another result describes (no doubt just a taste of) the various possible settings in a PDF file, and how, using a given PDF editing application, you can go through a PDF, remove some settings, correct others, and so on, reducing the size of the PDF by eliminating redundant or situationally irrelevant information — such as fields with null values — whose presence would bloat the file unnecessarily.

I’ve known for a few years that PDFs are a funny beast by nature when it comes to size. For me the best example by far used to be the use of “non-standard fonts” in the source file: oh, say, any open-source font that isn’t on the standard list of “don’t bother embedding this font, since we all know that nine out of ten computers on the planet have it”. In and of itself this isn’t a problem: why not allow for file-size savings when it’s a reasonable presumption that many text PDFs are based on a known set of fonts, and that most people already have that set installed on their systems? However, when one uses a non-standard font, or is sitting at that tenth computer, and constantly creates four-to-six-page text PDFs ten times the size of the source documents, frustration sets in. I had wondered whether a font substitution could be designated along the lines of “use a Roman font such as Times New Roman” when such a font — in my case Liberation Serif, or occasionally Nimbus Roman No9 L — is used, so I asked my “person in the know”. Apparently, Fedora 12’s default Ghostscript install, whose settings I have not modified, does just that.
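Two commands are worth knowing here, offered as suggestions since I haven’t dug into every setting: pdffonts (from xpdf/poppler-utils) lists each font a PDF uses, with an “emb” column showing whether it’s embedded; and Ghostscript’s ps2pdf accepts distiller parameters to force embedding when regenerating a PDF from PostScript. The file names are hypothetical, and the commands are printed rather than run:

```shell
# Sketch: check font embedding, and force it when regenerating a PDF.
# 'doc.pdf' and 'doc.ps' are hypothetical placeholders.

# pdffonts lists each font with an "emb" (embedded) yes/no column:
CHECK="pdffonts doc.pdf"

# ps2pdf distiller parameters: embed all fonts, but subset them to save space.
REGEN="ps2pdf -dEmbedAllFonts=true -dSubsetFonts=true doc.ps doc.pdf"

# Printed rather than executed, since they need the tools and the files:
echo "$CHECK"
echo "$REGEN"
```

Subsetting (keeping only the glyphs actually used) is what keeps embedding from blowing the file size back up.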

I guess what really gets me about this is how complicated the PDF standard must be, and how wildly variable the implementations are — at least, given that Adobe licences PDF creation for free provided the implementation respects the complete standard — or, more to the point, how wildly variable the assumptions and settings are in all the sorts of software that create PDFs. I bet that were I to take the same source and change just one thing, such as the equipment or software, the results would be wildly different.

So, concurrent to the above scanning project, I happened to be experimenting with a portable scanner (a fun challenge in and of itself to make it work, but it did, without “too much fuss”). And I found out something interesting, which I knew had nothing to do with PDFs but (I presume) rather with scanners, drivers, and xsane. I tried scanning some pages of one of the said annual reports with the portable scanner on an identical Fedora 12 setup using xsane, and the PDFs produced were far greater in size than those scanned with my desktop flatbed scanner. The flatbed would scan the text and the area immediately surrounding it, but correctly identified the “blank” part of the page as blank and did not render those areas, thereby significantly reducing the scanned image size. The portable scanner did no such thing: it created images from the whole page, blank spaces rendered (in this case, to a dull grey) and all, producing significantly larger PDF files from the same pages. As I mentioned, though, I assume this is a function of the individual scanners and their drivers, and possibly of how xsane interacts with them, and in my mind is not a function per se of how xsane creates PDF files.

Another interesting lesson.

19 months, 16 *successful* installs

I just did a tally of all the installs I’ve done on my personal systems since the end of June 2008, when I bought a new-to-me desktop and took advantage of the opportunity to upgrade from the CentOS 4.x series to the CentOS 5.x series. 🙂 And I was a bit blown over; unfortunately not surprised, but blown over nonetheless.

Over 5 systems, I’ve done 16 successful installs. Then there were a few dud installs that had to be restarted right away, although a couple of those duds were counted because the installs were actually useful for a few weeks. One install during the most recent cycle was not counted as successful despite the fact that it was: unfortunately, the boot sector on the drive died. (It was to be expected: back last June or July, Palimpsest identified a fatal error on the drive and gave it about 6 months to live, and whaddya know!) So I had to get another “new” drive, which I happened to have handy, and do another install.

To be fair, there has been one factory-sealed new system thrown in there (what fun to wipe the Windows install; curiously, Windows apparently irreparably froze up after all the updates were done, the whole exercise having been just to be able to say “yeah, but Windows worked on it!” — which it didn’t!), another system that has just about never been used since and after a few months has now been removed from the upgrade cycle, another system that finally died, or at least on whose ghost I have given up, and finally a replacement system for said “death in the family”.

One of the reasons I always say “I’d love to go back to CentOS if it weren’t so hopelessly obsolete” is that it’s stable and has a long life to it (typically something like 7 years), and hence with CentOS you don’t have to upgrade every six months like with Fedora — oops, that’s every 12 months or so, given the support cycle (wink wink). 🙂 That said, Fedora *has* been good to me since I started using it at version 9.

The problem is that when you have several systems, you’re still doing installs every 6 months or so if the systems aren’t in sync with each other. Further, one of the consequences of using second- or third-hand computers, buying new computers, upgrading parts and hard drives, and even trying out another distro at least once, is that your systems are hardly ever likely to stay in sync for the whole roughly 13-month lifespan of a new-version-every-6-months distro like Fedora. And of course, someone who would like to avoid doing new installs every 6 months is going to upgrade a system that is out of sync to bring it in line with the others, in the hope that “this will be the cycle when I get to enjoy the whole lifespan and not have to upgrade 6 months from now”.

Hence the ideal of trying to break the “install every 6 months habit” by syncing the installs with each other whenever a single new install is done is fallacious when you have at least two systems. In fact, you end up doing the opposite: not only are you installing (or re-installing) at least once every six months for one legitimate reason or another, but you end up doing multiple installs every 6 months, many of them unnecessary in and of themselves, just to keep everything in sync. And hence, the “install every 6 months habit”.

Of course, I have often been enjoying the process despite myself; in fact, I’ve managed to put together an ever-lengthening list of steps to take from start to finish when installing a system (which I’ll be presenting to one of the local LUGs in a few weeks). Fortunately, my computers are purely home desktops or hobby servers without any critical processes on them, and my brother at least humours my habit by doing those little bits that are still beyond my ever-increasing sysadmin skill set (which, of course, grows with each install cycle). And in the process I’m gaining a practical appreciation for what I’ve known all along since I started using Linux in 2006 with CentOS: “The likes of Fedora and Ubuntu may be great, but you have to re-install every 6 months! Who wants to do that?!?!” (Apparently, I do. 🙂 )

It must be interesting having multiple production servers with multiple versions of a given distro, let alone more than one distro (i.e. a mix of CentOS, Debian, SuSE, and, for some good fun, Slackware). Good thing that having “the latest and greatest” usually isn’t particularly important on a server, so that it can actually have a useful life. It must be hard for the likes of Red Hat, though: it must add new drivers all the time, but to keep from breaking compatibility and introducing “bad change” into the distro, other things don’t happen — things like the HPLIP version that is one incremental point release (or whatever the 0.x increments are called) behind the minimum requirements for my 2.5-year-old printer, a package which has since gone through several such incremental upgrades and at least one whole version upgrade.

Fedora 12 installed — I’m a linux addict with an install every 6 months habit

Well, over the past couple of weeks I’ve installed Fedora 12 on three systems, mainly because I got a great, great, great “new” P4 3.0 GHz home server, which I have been considering using as my desktop while turning my current desktop, a P4 2.8 GHz, into the server.

To my dismay, I have done this 6 months after I made a point of putting the same version of Fedora on all my computers precisely to get off the “reformat a system every six months” treadmill that having different versions put me on. But, well, my old server died, and of course there was no point in putting on a 6-month-old version of Fedora that I would only *have* to change 6 months from now anyway … Sigh, the bliss of using CentOS; were it only not so completely obsolete, I would love to use it again … On the other side, though, Fedora is the crack cocaine of “latest and greatest”, so for the moment there’s no going back!

All of this started back in, what, September, if not before; I couldn’t get the 80 gig drive and the 500 gig drive to play nice together, or so I thought. There *was* an issue with different spin speeds, but wait folks, there’s more. When I *did* have the 500 gig as the boot disk, something seemed off with the amount of available storage, although I wasn’t fully aware of it. When I finally brought the 500 back as the boot drive, the installation went well several times, first with Fedora, then with CentOS 5.1 (which would have been promptly updated upon reboot). Except that the first reboot wouldn’t work: the system would freeze, and the keyboard buffer would fill up real quick. Forums were of little help, with plenty of dead ends and apparent red herrings. Finally I figured out on my own that the BIOS was way too old to recognize such a large drive, and flashing it with a “new” BIOS would have required a lot of fun with WINE, which I wasn’t really wanting to get into using a live CD.

Christmas and a new server came along, and I’m up and running with a desktop upgraded thus: June 2008, CentOS 5.1 to 5.2; July 2008, some version of Ubuntu; December 2008, Fedora 10; July 2009, Fedora 11; and now January 2010, Fedora 12 … plus a netbook, a laptop which is no longer used, an old server, and a new server, each following a similar route for much of the way. So much for even taking advantage of Fedora’s “one month after the release of the second version following” support period … I’m still upgrading every 6 months!

As a result, though, I have finally refined my “to-do” list for installing a machine so that it’s not so much of a hassle; in fact, two of the three setups were not only a breeze in and of themselves, but the to-do lists made the rest of the work a breeze too. Of course, my brother told me two years ago that his list was 300+ steps long, and that he’d found a two-year-old such list that was only about 120+ items long. My list is currently somewhere around 58 items long, depending on how you count it … I wonder how long it’ll take to get to 300? 🙂

However, I had problems with the desktop right after it installed like a breeze: the disk’s boot sector died (I had expected it would, as of about 6 months ago), and funnily enough the memory stick on which the setup had worked like a breeze before suddenly wasn’t cooperating. I’ve got to figure out what was going wrong with UNetbootin creating the USB stick images from the ISO; curiously, the boot image required after the disk formatting in Anaconda wasn’t being properly copied, or at least activated, on the stick.
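When UNetbootin misbehaves, Fedora’s own livecd-iso-to-disk script (from the livecd-tools package) is a possible alternative for writing an ISO to a USB stick. The ISO path and device name below are hypothetical placeholders, and the command is printed rather than run, since actually running it overwrites the target partition and needs root:

```shell
# Sketch: Fedora's livecd-iso-to-disk as an alternative to UNetbootin.
# ISO path and device are hypothetical; double-check the device name,
# because the target partition gets overwritten.
ISO="$HOME/isos/Fedora-12-i686-Live.iso"
DEV="/dev/sdb1"

CMD="livecd-iso-to-disk $ISO $DEV"

# Printed rather than executed: requires root and the livecd-tools package.
echo "$CMD"
```

Since it is Fedora’s own tool, it would at least rule out UNetbootin as the variable if the stick still fails to boot.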

Anyway, I now have to work on getting the most out of the system. I bet that months from now Fedora will find a way to make me upgrade again, with that lucky number associated with it and all … 🙂

The new Google OS

Well, for those of you who haven’t heard, internet darling Google announced in the past day or two that it will be releasing a new OS, expected in 2010 (here’s my archive).

I had a few reactions:

– Google getting headline news should make it interesting, and they have the money and clout to be a real competitor. I saw the news about the new Google OS on the morning news, and one of the taglines was “Google to launch operating system”. Sorry, Ubuntu only gets headline news among gearheads like me (see below), and it’s a footnote at best when people talk about that South African space tourist.
– What will “it” be? Linux? Google-Hurd? Open-source? GPL? BSD-licence? Apache Licence?
– I wondered what it would be about. Goobuntu? Ahhh, it’ll be an internet-centric linux distro — meaning, even though it’s obvious that it’s meant to be a MS-killer on the netbooks (with the possibility that it could be released, with appropriate changes, for the desktop too), its main comparison will be gOS. (Insert tongue firmly in cheek here. Then bite.)
– It’ll only work if it A) deals with the problems Fedora has out of the box (flash, mp3, avi, DVD, etc.), no doubt by including such support out of the box, and is generally AS GOOD, POINT FOR POINT, as MS, and then some; and B) does something better than MS — and something that people want.
– It’ll likely have to change the computing paradigm. The cloud computing paradigm has been touted for about seven years or more now and has only been taking off in the past year or so. Google has been slowly eroding MS with things like Gmail and Google Docs, alongside Firefox and OpenOffice.org, and generally contributing to open source and other projects, but I’m wondering when the break point will come when suddenly EVERYBODY drops MS and goes somewhere else, or rather when the pie becomes properly split up such that what’s under the hood matters less than what goes on on the screen. Oh, and people don’t like change. Resistance to change is one of open source’s (no, scratch that, “any alternative to MS”’s) biggest enemies.
– My original take on the above was that Google *would* be the people able to push things beyond the breakpoint.
– I’m wondering if it will have to go on par with MS by pulling a Novell to integrate MS-files nicely.
– Ahh, “machines with it will be sold starting in 2010” — it was, but also wasn’t, as specific as that … will there be a slice in it for Google? Or will it be given away the way other distros are, but with insidious settings that encourage users, without their realizing it, to go to some web page that has Google click ads? Or … what’s in it for them?

Then, of course, I’m listening to one of these “play to the lowest common denominator, then add 2 points of intelligence” syndicated talk-radio hosts, who has a guest talking about this subject. To set the stage, the previous topic he discussed was a YouTube videoclip of a person using both hands to shave his head while driving, and whether there should be a law against such a thing, which he caps off with the likes of “there should be an anti-moronification law against such morons.”

To be fair, the stance he and his guest take is targeted at most people who inexplicably (to me, anyway) have no clue that there *is* an (easy) alternative to MS on the PC, besides the Mac, which he rightfully puts in a class of its own. And, Linux *is* mentioned as an available alternative, but “it’s pretty much for the gearheads”.

Here’s what I sent him, I was so riled up:

*****

Forget Ontario hair-shaving idiots making the roads less safe, I wonder about those on the radio who say linux is for “gearheads”.

I suppose I’m a gearhead, I do indeed like computers for their own sake beyond the day to day usefulness they present.

However, I’ve been using various versions of Linux for the past several years on my PCs and take great pleasure in overwriting any existing MS format on any new computer I get — over the past three years, that’s about 5 computers, some formatted a few times over. Some are older and more archaic than the netbooks your piece mentioned, let alone today’s top-of-the-line desktops, and I’ve been using them for desktop uses, not server applications. On them are full OSes that are not stripped down — unless, of course, I chose one of the minimalist versions — and, interestingly, they are not all that slow.

There are several versions which are geared toward the “average” user. Most of the more common versions can do all of the day to day uses that were mentioned in your piece and are on par with — sometimes superior to — MS. I use a version that is a cross between the “gearhead” market and day to day usage. I recommend to newcomers Ubuntu, which I do not use. Virtually all users of MS however would be able to use Ubuntu, available at ubuntu.com, with no difficulty, and it is the most popular of the linux versions and is not aimed at the “gearheads”.

I was incredulous, listening to the show, to hear that people still think MS is the only option for their PCs. I suppose the few who have heard of Linux figure that something given away for free is worth the money paid for it. Au contraire: MS is less configurable and, as you know, more virus-prone than Linux; for the virus part, you have to pay more to get properly protected. Linux, on the other hand, is safer, faster, and free compared to MS.

I found your guest informative, but I found the bias toward Linux not being a competitive alternative to Windows on the desktop — which it has been for years — “very interesting”.

*****

Oh, I do think that the driver in Ontario is a complete moron. 🙂

And Mr. Shuttleworth, please note that I *will* recommend Ubuntu to the general public since the learning curve is easier than even Fedora’s.

Oooooops, I was wrong … (so what else is new?)

In a previous post, “I may just have that reason to get rid of Ubuntu”, I stated that the thing that killed Ubuntu for me was the difference in how OO.o on Ubuntu 8.04 and on Fedora 9 deals with the “notes” function in a document. The versions of OO.o in question were, according to distrowatch.com, 2.4.0 for Ubuntu 8.04 and 2.4.0 for Fedora 9. As of today my F9 notebook has updated itself to 2.4.2, so to be fair I imagine that over the past year Hardy Heron has had some updates as well.

Today I stumble across this little gem from the OpenOffice.org website:

Improved Notes Feature in Writer

“In the past, notes in OpenOffice.org were just displayed as small yellow rectangles within the text. This was not very intuitive and user friendly. With version 3.0, OpenOffice.org got an advanced notes feature which displays notes on the side of the document. This makes notes a lot easier to read. In addition, notes from different users are displayed in different colours together with the editing date and time.”

Ooops.

I went off on a holy rant, wondering why the heck Ubuntu had changed a few more things than it arguably needed to, when in fact … well, it apparently hadn’t: the annoying yellow dot was a function of OO.o to begin with, at that time. If anyone was changing things, it was Fedora backporting this function some time last fall (assuming it wasn’t OO.o doing it), or adding a preview into the version that Fedora grabbed and included in F9; though that would run against Fedora’s usual policy of not using custom patches beyond those necessary for Fedora integration, and of not backporting updates, opting instead for rapid changes, new releases, and submitting bug and improvement patches upstream.

Which is perhaps not saying much, since at release time both distros were using 2.4.0; rather, it only raises the question of why two nominally identical pieces of software behave differently, and perhaps lifts the blame away from Ubuntu.

Oh well, I still don’t like Ubuntu. 🙂

In the meantime … I wonder how this is explained, given that both distros apparently had the same version of OO.o at release time. I wonder if the feature was backported in Fedora (the package changelog, via “rpm -q --changelog”, would probably tell). Or if it was backported by OO.o itself and Fedora simply passed on the change. Or … ?

And in the meantime as well, I wonder about the notes function in previous versions of the 2.x series of OO.o having acted in “the new way” at least as far back as 2006; again from the same post, second-to-last paragraph:

“The appearance of the notes in the margin is not a recent occurrence in OO.o, at least in the 2.0 series: back in August 2006 under CentOS 4.4 (OK, this is still the Red Hat family) I received a document with the notes visible in the margin (being a work contract I declined the document and asked that they resend the proper version, please.) I was using the standard OO.o 2.whatever downloaded and installed directly from openoffice.org (since the CentOS 4 series originally came with OO.o 1.5.something series; I’d been using the OO.o 2.0 series for close to a year at that point under Windows before I’d made the switch to linux.)”

Printing PDFs

I’ve just had an interesting object lesson in the differences between two different pieces of software that have essentially the same function.

Today I had an important PDF document to print out at home instead of at the office. For practical convenience, it was far better to print it at home and just deliver it to the office than to spend the extra 5 to 10 minutes at the office turning on my computer and printing it there, on a printer I knew would have no difficulty dealing with it, having printed a few dozen identically generated documents on it before.

On my pretty much stock Fedora 10 box, I use Evince Document Viewer 2.24.2 (using poppler 0.8.7 with cairo) for the Gnome desktop to display and print PDF documents. So far, I’ve been satisfied.

The PDF’s layout had margins beyond my printer’s abilities, and of course the most important parts of the document, sitting right at the edges of those margins, were being cut off in printing. Reducing the print size was not useful, since the vital information was at the end of the document, in the part being cut off in the margins. I suppose I could have tried rotating the document to see whether the cut-off part would contain anything crucial, though I didn’t think of that at the time. Both these strategies, however, miss the point: if the original document has very narrow margins, something is going to get cut off no matter what; not exactly desirable.

I did try something that happened to involve a Windows box (ughh) mostly because it had a different printer, and you never know how things behave differently with different equipment.

Not surprisingly, the Windows box happens to have an Adobe viewer installed (I avoid that box as much as possible; I don’t even maintain it, that’s my brother’s job. 🙂 ). I click to print the document and whaddya know, in the print dialog there’s an option to fit the document within the printable area. Document printed, convenience secured.

Now what I would like to know is how much of the print window on my desktop is governed by HPLIP, how much by Gnome, how much by CUPS, and how much by the application invoking it at the moment. So I did a little experiment: always selecting my printer, I opened a print dialog in Evince Document Viewer, OpenOffice.org (3.0.1), Firefox (3.0.7), The Gimp (2.6.5), Xpdf (3.02) (which I intentionally installed for the purpose of this experiment), and gedit (2.24.3) (on which I’m composing this blog). Besides Xpdf, each appears to have the same base, and except for Evince Document Viewer, each also adds a function tab of its own. Xpdf, on the other hand, has its own stripped-down interface: either invoke the lpr command or print to a file.

Here’s a quick table listing the tabs listed in the print dialogs available in five, off-the-shelf standard installs of Fedora 10 software, with my printer selected, plus Xpdf, which was installed directly from the Fedora repositories without any modification of settings or whatever on my part:

OpenOffice.org*: General; Page Setup; Job; Advanced; Properties
Firefox: General; Page Setup; Job; Advanced; Options; Properties
Document Viewer: General; Page Setup; Job; Advanced
The Gimp**: General; Page Setup; Job; Advanced; Image Settings***
Xpdf: its own stripped-down interface
gedit : General; Page Setup; Job; Advanced; Text Editor

* There is an Options button in the “Page Setup” tab for OpenOffice.org.
** The Gimp treats my “special” PDF as an image much like any other, and automatically sizes it to the current settings, much as it would handle a .png or .jpg image.
*** The Gimp has an option to ignore the margins; see above note.

Not one of them, besides The Gimp, has an option to fit the document within the printable area, and The Gimp only indirectly, because of the way it seems to handle PDFs by default as images to be manipulated. And of the others, to be fair, only Document Viewer and Xpdf deal with PDFs at all; even Firefox delegates PDFs to Evince Document Viewer by default.

Then I did another little experiment: I installed Adobe Reader 9.1. (That license is interesting, pretty convoluted, and makes me wonder whether I may use the installation at all; in any case, I’ll be getting rid of it, since I really only installed it for the purpose of this experiment, and I decided a while ago that having two PDF viewers above and beyond what’s available in the basic distro installation is superfluous unless there’s a particular reason for it.) And what do I see? A new print dialog that reminds me of the one I saw earlier on the Windows box. Interestingly, it has “fit to printable area” and “shrink to printable area” options.
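For what it’s worth, the arithmetic behind those two options is simple. Here’s a minimal Python sketch of my own (an illustration, not Adobe’s or Evince’s actual code), with page and printable-area dimensions in millimetres:

```python
def fit_scale(page_w, page_h, printable_w, printable_h):
    """'Fit to printable area': scale up or down so the page
    just fits inside the printer's printable area."""
    return min(printable_w / page_w, printable_h / page_h)

def shrink_scale(page_w, page_h, printable_w, printable_h):
    """'Shrink to printable area': scale down oversized pages,
    but never enlarge a page that already fits."""
    return min(1.0, fit_scale(page_w, page_h, printable_w, printable_h))

# An A4 page (210 x 297 mm) onto US Letter with 6.35 mm margins
# (printable area roughly 203.2 x 266.7 mm): the page shrinks to ~90%.
print(round(fit_scale(210, 297, 203.2, 266.7), 3))     # 0.898
print(round(shrink_scale(148, 210, 203.2, 266.7), 3))  # 1.0 (A5 already fits)
```

The whole feature amounts to one min() over the width and height ratios, which makes its absence from the Gnome dialogs all the more puzzling.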

So my little experiment has led me to the following conclusions:

– many pieces of software, presumably not wanting to reinvent the wheel, rely on the OS or (I suspect, at least in this case) the desktop environment for their print dialogs;
– some software authors do want to reinvent the wheel, whether to do it their own way or to be completely platform- and environment-independent, and therefore make their own dialogs;
– some software authors want to do extra things without reinventing the wheel, so they wrap an existing base to add extra functionality;
– in my documents, I shouldn’t push stuffing as much content as possible into each page too far, at least not by playing around with the margins.

Looks like something for the Evince authors to toss in. Assuming, of course, that resizing a PDF and/or its content to the local printer’s printable range (without fundamentally changing the document) is a really useful feature, such as for dealing with awry margins, or with PDFs sized for A4 instead of letter or vice-versa. 🙂 And that such non-conformities, and their prevalence, make it worth my using Adobe Reader, licensing issues aside. Or that there’s another PDF reader out there that already has that functionality.
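In the meantime, one command-line workaround exists outside any viewer: Ghostscript can rewrite a PDF onto a given paper size, scaling each page to fit. Here’s a small Python sketch that builds such a command (assuming Ghostscript is installed as “gs”; the file names are just placeholders):

```python
import subprocess  # only needed if you uncomment the run() call below

def gs_refit_cmd(src, dst, papersize="letter"):
    """Build a Ghostscript command that rewrites src onto the given
    paper size, scaling each page to fit the media (-dPDFFitPage)."""
    return [
        "gs", "-q",
        "-sDEVICE=pdfwrite",
        "-o", dst,                  # -o implies -dBATCH -dNOPAUSE
        "-sPAPERSIZE=" + papersize,
        "-dFIXEDMEDIA",             # force the requested media size
        "-dPDFFitPage",             # scale each page to that media
        src,
    ]

cmd = gs_refit_cmd("a4-document.pdf", "letter-document.pdf")
# subprocess.run(cmd, check=True)  # uncomment to actually run Ghostscript
print(" ".join(cmd))
```

CUPS itself also accepts a fit-to-page option at print time (“lp -o fit-to-page file.pdf”), which would sidestep the application dialogs entirely.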

Hmmm … OO.o differences, Fedora, and Ubuntu

In my post “I may just have that reason to get rid of Ubuntu” I whined about minor differences between the “stock” OpenOffice.org appearance and functions (the version I used straight off the OO.o website) and what ships with Fedora.

This blog (here’s my archive) explains a bit why: It says “Many Linux distributions ship ooo-build. … Fedora ships a modified OpenOffice.org, but Fedora does not use ooo-build.” Which means that in keeping with Fedora’s usual policy, it ships upstream versions of software with only reasonably required modifications to make it work under Fedora. When I was using CentOS, I was using the vanilla version directly from OO.o.

That explains a few things. It doesn’t necessarily justify my whining, nor all the changes Ubuntu or other distros (or even Fedora) make, but … why mess with a good thing? 🙂