Archive for the ‘technology’ Category

upgrading ubuntu - the horror

February 1st, 2007

I can't believe how long it's been since we started using distributions, picking the one we like and staying with it, because we perceive certain advantages the others don't have. And when a new version comes out, we install it, replacing the old one. How much longer will it be before it is possible to *upgrade* a distro without friggin breaking it in half??

I installed Ubuntu Dapper on the "family" pc back in Norway in.. July...ish. I'm back in town now for a few days and as I logged in, it still worked beautifully. Nothing broken, no problems, nothing. But Edgy has been out for some time, and keeping up with updates is generally recommended to stay more or less in the loop long term. So I decided to update. I looked up how in the documentation and followed the instructions. It started off so well that I was impressed. It first removed all my "custom" sources in sources.list and then it set off. I had used Automatix to install multimedia stuff, but I think that was the extent of my "modifications". There was one entry for the latest amarok in sources.list, but the rest I think were standard.
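For reference, the documented route boiled down to roughly the following (a sketch, not the exact commands I ran; to be safe it operates on a throwaway copy here rather than the real /etc/apt/sources.list):

```shell
# Throwaway stand-in for /etc/apt/sources.list, so nothing real is touched.
printf 'deb http://archive.ubuntu.com/ubuntu dapper main restricted\n' > sources.list.test

# Step 1: point every repository line at edgy instead of dapper.
sed -i 's/dapper/edgy/g' sources.list.test

# Step 2: refresh the package lists and upgrade (commented out here;
# on a real system, run as root against the real sources.list):
# apt-get update && apt-get dist-upgrade
```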

But it did not carry on that way. At one point I got a big fat warning about some package not being able to install/configure/whatever. Then I got a dozen more of them. Once the process was done, without throwing a fatal error my way, the little icon indicated that "a reboot is required", so I rebooted, thinking I could probably fix the bugs afterwards. It was not to be. Upon boot, X wouldn't start. I was getting strange errors about a permission problem with /dev/null or something. The system was completely broken and I really didn't feel like resuscitating it. I use Ubuntu because it's no hassle and "just works". :rolleyes:

And there ends the tale of the upgrade. Once again, after so many years, an upgrade between versions leaves the system completely broken. How much longer do we have to wait until this ceases to be a problem? I can appreciate that it's complicated, but how many other complicated problems have been solved?

undvd: dvd ripping made easy

January 29th, 2007

I always found ripping dvds to be a huge pain, because of how complex the process is. There are a million ways to convert a dvd to avi format, a myriad of settings to play with, options to tune for performance, for size etc. That's great if you want to tinker. But it's much more difficult to give a straight answer to the question "how do I rip a dvd?" without going into all these details. I for one would like a simple way that works on any dvd, every time.

So that's what I set out to do. It took me *a lot* of testing and playing with the settings to find a recipe that both gives great quality and doesn't take too long. And still there may be, and probably will be, cases where the results aren't great. But for my own use, it works very well. My main goal was to hide as many details as possible from the user, turning the complicated maze that is mencoder into a single button to push. As it turns out, however, it's really hard to abstract away everything completely, so even with undvd there is a (hopefully modest) learning curve.

undvd is a couple of bash scripts, which I decided to base on lsdvd and mencoder (part of mplayer). In doing so, I wanted to use the disc as little as possible, considering all the problems I've had with reading dvds in the past. I also found that extracting the vob loses some of the information about audio/subtitles, so I first clone the disc with dd and then go to work on it. The script starts off by making an image of the disc, whereupon the disc is no longer needed.
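The cloning step itself is just one sequential dd read. A minimal sketch (the device path on a real system is an assumption, often /dev/dvd or /dev/sr0; the demo below uses a stand-in file so it can run anywhere):

```shell
# Stand-in file playing the role of the physical disc for this demo;
# on a real system DVD_DEVICE would be /dev/dvd or /dev/sr0.
DVD_DEVICE=fake_dvd.bin
printf 'pretend this is a dvd' > "$DVD_DEVICE"

# One sequential read of the whole disc; after this the drive is idle.
dd if="$DVD_DEVICE" of=disc.iso bs=2048

# mencoder can then read titles from the image instead of the drive:
# mencoder dvd://1 -dvd-device disc.iso ...
```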

First, to see what's on the disc, run scandvd.sh.

scandvd.png

At this point, you have to decide on which title(s) to rip. If you don't know what they are, scandvd.sh suggests using mplayer to find out. Once you know what to rip, you run undvd.sh with the chosen options. Just keep in mind that the files will be created in the directory you run undvd.sh from, so make sure you have enough disk space.

undvd.png

What is worth noting here is everything that you don't see. mencoder is run in the background, with a host of complicated settings, but you don't see the horrifying output. You only see the status of what is happening, the settings you chose, and mencoder's estimated time to completion. Sure, the full log is there if you want it, just say the word. But unless something goes wrong, you don't need to see it, this will do just fine.
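To give a flavor of what's being hidden: a single-pass x264/mp3 rip of title 1 from the disc image might look roughly like this. The individual flags are standard mencoder options, but this particular combination is my illustration, not necessarily undvd's actual recipe:

```shell
# Illustrative mencoder invocation for one title; the settings here are
# a sketch, not undvd's internal recipe.
TITLE=1
CMD="mencoder dvd://$TITLE -dvd-device disc.iso -ovc x264 -x264encopts bitrate=900 -oac mp3lame -lameopts abr:br=128 -o $TITLE.avi"
echo "$CMD"

# undvd runs something like this in the background, hiding the output:
# $CMD > logs/mencoder-$TITLE.log 2>&1
```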

undvd_result.png

After ripping is finished, what you'll have is the files shown. 01.avi and 02.avi are the titles. disc.iso is the image of the dvd, which you can use to rip more titles, or just delete. And then there are logs that you won't even bother looking at unless something went haywire.

And that is dvd ripping reduced to one line of output for every title. Simple, isn't it? :)

Get undvd from opendesktop.org.

A technical note

Make sure you have lsdvd and mplayer installed (with support for encoding, x264, xvid, and mp3/mad).
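A quick way to check that the tools are actually on your path (package names vary by distro; this just probes for the binaries):

```shell
# Report whether each required tool is installed.
check() {
  if command -v "$1" >/dev/null 2>&1; then
    echo "ok: $1"
  else
    echo "missing: $1"
  fi
}

for tool in lsdvd mplayer mencoder; do
  check "$tool"
done
```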

is 2 pass encoding really worth it?

January 26th, 2007

I'm trying to figure out how to rip dvds in near perfect quality, because I can't stand them. Dvds are such a pain in the ass, with their idiotic menus for kids, how the discs so often can't be read properly by the dvd drive, how they will play on one device but not another etc. As far as I'm concerned, the whole thing is broken.

So, not really having done this before (I've tried in the past with meager results), there are a lot of angles to cover. Most docs seem to recommend 2 pass encoding, whatever the format. I've experimented with x264 and xvid, and I can't really see a difference between 1 pass and 2 pass encoding when running at a 900kbps bitrate. Probably at higher compression it becomes more apparent, but I'm satisfied with that ratio.
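With mencoder's x264 options, the difference between the two recipes is just the pass= setting, run once versus twice. A sketch (the option strings are standard x264encopts syntax; the surrounding command is illustrative):

```shell
# One-pass versus two-pass x264 at the same 900kbps bitrate.
ONEPASS="-ovc x264 -x264encopts bitrate=900"
TWOPASS1="-ovc x264 -x264encopts pass=1:bitrate=900"
TWOPASS2="-ovc x264 -x264encopts pass=2:bitrate=900"

# 1-pass: mencoder dvd://1 -dvd-device disc.iso $ONEPASS -oac copy -o out.avi
# 2-pass: the same command twice, first with $TWOPASS1 (writes a stats log),
# then with $TWOPASS2, which reads the log to distribute bits where needed.
echo "$ONEPASS"
echo "$TWOPASS2"
```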

technology too complex to handle?

January 13th, 2007

I know it's a strange question to ask. Technology is *meant* to be complicated, it solves complicated problems after all. But I wonder if there isn't a point where we feel it's becoming too complex to be manageable. I say "technology" when I really mean "software", because from a user's point of view, I think it's "technology" they feel they're dealing with.

Now, when I say "too complex" that needs to be qualified. It will always be possible to solve problems, but the more complex they are, the more time it takes. And if it gets to a point where it takes so much time that it's not worth it, well then that's going to be a dilemma.

It's probably fair to say that users are pretty comfortable with software on their computer. It's a familiar environment, and while there are failures sometimes, those problems can be solved by you or "your geeky friend". Everything you need to fix the problem is found in your Emergency Kit, your DVD repository of software, fully legal software if you run open source. In the worst case, reinstall the OS and you're golden.

But since the advent of the internet, we have taken this one step further. "Technology" is all about hiding the complexities of the systems behind OK/Cancel dialog boxes. It's about making complicated things appear simple. And that's a wonderful achievement. The "early internet" *was* simple. Web and email are both remarkably simple technologies, there isn't a whole lot that can go wrong. And I think that to a common user, web and email are transparent enough to understand when it does fail. "Server not found, hm is my internet connection working? Is my network cable plugged in? Is the little light on my DSL modem lit?"

Yes, it does take some basic internet literacy, but we can handle it. But that was then. Right now we have firewalls and NAT adding a new, and highly frustrating, layer of complexity to our experience. When people ask me "how come I can't connect to the forum from home but it works from work?", you can never rule out an evil firewall. When you want to play online games, transfer files over instant messaging, use peer to peer or VoIP (skype) and so on, sooner or later you have to figure out how to get around NAT and how to manage your firewall.

The present internet isn't as simple as the "early internet". Web and email are still around, but we also use a lot of other things. Among them - bittorrent. Now, bittorrent is somewhat of a wonder, it's a phenomenally functional technology. It "always" works. It's slower if your network setup is suboptimal, but it works well enough to keep us happy. And so bittorrent has achieved something very precious. It has successfully hidden the complexity of the system behind a simple interface. Yes, bittorrent is complicated, no one would deny that. But do you even care? No, because you don't have to.

Contrast that with other technologies you use. Like.. skype. Keep in mind, skype is known to be a *successful* case in this context. But in my experience, it has basically *never* functioned on any of my computers as advertised. At best, I get mediocre sound quality with occasional transmission delays. At worst, it's completely unusable. And I even know how to set up my network.

So, the trend is to masquerade increasingly more complicated technology as "simple". Joel Spolsky calls this leaky abstractions, when something "simple" does not work, and fixing it tosses you into the deep end. There is a point where users are no longer able to help themselves. In the old days, cars were quite simple, so a lot of people would fix their own cars if the problem wasn't too serious. That happens less now, because cars are loaded with fancy technology, and even opening the sealed cover of an electronic steering system would void your warranty.

We want to do more sophisticated things on the internet, so we create technology that is increasingly complicated. When it fails, we're out of luck. Instant messaging is made to appear incredibly simple, but it's not. When you're trying to transfer a file to the person you're chatting with, there is a long chain of conditions that must be satisfied so that the transfer will work. Voice and video chat is more complicated. Video-on-demand-over-bittorrent (rumored to come in the near future) more complicated still.

Right now, it's quite difficult to debug problems for other people over instant messaging. It takes a lot of time and effort, and there are many sources of failure. Some problems are never solved at all. I'm still able to debug my own problems, though, but with more and more layers added to every service, I wonder if there will be a time when I won't be able to do that either.

a temperature regulator appliance

January 2nd, 2007

As I'm sitting at my desk all day, reading tons of pages of software architecture theory, my concentration fluctuates. At times I'm totally focused, then it fades out and I have to snap out of it and re-read what I just read. My mind wanders sometimes, in the middle of a paragraph I start thinking about something completely different. But I've noticed that the room temperature has an effect on my ability to focus. It's a balance of opposites. Too cold and I'm cold. Too hot and my head grows tired rapidly. There isn't an ideal temperature. Ideally, I would maintain a different temperature in my head than in my feet.

But regulating the temperature in the room is a good measure. The temperature outside right now ranges from about 3 degrees in the daytime down to -2 at night. When I open the window and feel the cool air, I focus much better.

You could potentially create an appliance to regulate the temperature instead of opening/closing the window and fiddling with the dial on the heater. And there could be several ways to do this.

The you-deserve-to-be-beaten-with-a-bag-of-oranges-for-contemplating-it way

Setting the temperature to fluctuate within a given range, at set intervals. This is really stupid, because a constant, mechanical motion like this is almost certain not to produce the desired effect.

The acceptable way

An acceptable way of doing this would be through statistical analysis of human behavior. Statistics are boring, but they kick ass. If you analyzed how and when the temperature was changed by the user, you could model that behavior mechanically. This might depend on the temperature outside, on the time of day, on how I'm feeling that particular day, and so on.

The perfect way

Keeping tabs on the body's functions and responding accordingly. This would address all concerns, because if you can determine the state of the body, then your rationale for response would be just as good as a human's.