Archive for July, 2009

Netflix Has a Developer API

July 27th, 2009

I wasn’t incredibly happy with the movie synopses I was getting from IMDB. They’re generally pretty crappy. I went looking around to see if I could scrape the Netflix synopses and, lo and behold, Netflix has an API!

In another open source project I’m working on, I need to learn GTK+. So I figured the easy way to learn GTK+ was to start with php-gtk. It’s more or less a replica of the gtkmm OO interface, so I set out to update my little movie categorization script with a GTK+ interface. After learning the ropes, I finally have a nice interface that queries Netflix and displays all of their data.

This is what I have so far (keep in mind this is all in PHP):


When you click on a movie in the list, it queries Netflix and fills out the description pane. So far it’s really simple, but hopefully I can use this to generate something that will categorize movies specifically for a UPnP client. I can’t put any source code out yet since I’m not too sure how the Netflix API deals with publishing an app. Right now, it has my personal developer key hard-coded, and I only get 5,000 queries per day.

Here’s a video (and of course, you’ll need Firefox 3.5):


Logitech Harmony Remotes and the PS3

July 26th, 2009

First things first: If you’re only minimally using your PS3 as a multimedia device, the Logitech Harmony remotes and Playstation 3 adapter are not for you. If you use your PS3 as a media hub, then it is a must have item.

My experience started when I realized my cable box DVR remote didn’t have a 30 second skip feature. I figured if I was going to buy a universal remote, it needed to be universal. After a bit of research, I ended up with the Logitech Harmony 880. From Amazon, with the PS3 adapter, it was about $180.

My first instinct was, “Did I really just spend $180 on a remote control?” Yeah, that’s a good chunk of change. If you’re looking for a cool toy to play with, I recommend it, but if you’re trying to save money, having a few different remotes and/or using the PS3 controller isn’t that big of a deal.

So if you were to ask me if I got my $180 worth, my answer would be, “no.” However, that doesn’t mean this thing isn’t really cool.

Instead of the standard style of universal remotes, Logitech has given us “activity based” universal remotes. Depending on the activity that you choose, the remote buttons react differently. For example, with a standard universal remote, if you want to switch from your DVD player to your cable box, you push the TV button, select the appropriate input, then push the cable button to control the cable box. If you have a receiver/amplifier that needs to change, you push that button and change the inputs as required.

With the Harmony remotes, you select an activity. For example, this video shows me switching from watching a show on a cable box to moving to my movie collection streamed from the PS3 (requires Firefox 3.5):

The important part to note is that I’ve only pressed one button. I set up the remote with a “Watch Movies” button. I push the button, it turns on the PS3, switches the TV input, and automatically navigates to the movies in the PS3 XMB. If I want to go back to watching TV, the remote turns off the PS3, switches the input back to the cable box, and brings up the guide.

Of course, all of this is configurable and you can create whatever macros you want. All in all, it’s a nice device. If you’re a fan of a single remote, this is a must have item. If you want a cool toy to play with, this is a must have item. If you just use the media features occasionally, skip it.


HTML 5 and the <video> tag!

July 11th, 2009

With Firefox 3.5, we get our first real chance to use the <video> tag. And man, it is awesome. If you’re using IE, Firefox 3.0, or Chrome, you won’t be able to see it. It seems there’s all kinds of debate about the codecs that the browser should support. Right now, Firefox uses ogg theora/vorbis.
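For reference, the markup really is as simple as it sounds. Here’s a minimal sketch (the filename and dimensions are made up for illustration):

```html
<!-- A hypothetical clip encoded as Ogg Theora/Vorbis; no plugin required -->
<video src="demo-clip.ogv" width="640" height="360" controls>
  Your browser doesn't support the video tag.
</video>
```

Browsers that don’t understand `<video>` (IE, Firefox 3.0) just render the fallback text inside the element.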

Apple is apparently complaining about the Ogg format. They say there are possible patent issues they want to avoid, which is a weird claim since Ogg is specifically designed to be free of such restrictions. More likely, they want to push their own proprietary .mov format, or possibly MP4. Either way, I’m sure it boils down to Apple wanting more money. This would also hamper open source projects’ ability to use the video tag, since there is no way for them to license the corporate proprietary formats.

Microsoft is strangely quiet about the whole thing. I’m guessing they’ll put out only WMV support and call it a wrap. They may just embed WMP into the webpage to handle the player. I wonder how long it will take to initialize that.

Anyhow, here is the video tag in action (OGG Theora variable bit rate, Vorbis: 96kb/s 2 channel):

Hopefully Sam and The Chin won’t mind me using this clip for a demonstration.

On a side note, is there a way to make Windows Media Player play H.264 encoded movies? Does it not support that capability?


The PS3 Slim and All the Hype

July 4th, 2009

There’s been a lot of buzz in the media about the PS3 Slim, with all kinds of speculation and rumors flying about. I don’t personally know anything about any of it. What I do know is what I want to see from Sony: a 19″ stereo-equipment-style device.

Maybe I’m too old. I don’t want a flashy, sleek-looking case with special LEDs and all that nonsense. I just want a normal-looking piece of equipment that fits in a stereo rack and is stackable with other equipment. I don’t like playing a game of Jenga and/or Tetris with my various game consoles.

Sony made the PSX (not to be confused with the original Playstation) for a while, but it was a DVR and a PS1/2 mixed together with an $800 price tag. What I’m talking about is a standard PS3 with a normal audio receiver style form factor: 19 inches wide and a flat top.

All this talk about the PS3 slim is great and all, but who really cares? Give me a standard A/V equipment form factor and I’d replace my current PS3 (as long as they don’t jack up the price). It should actually be cheaper to manufacture since venting would become less of an issue.

I can’t be the only one out there that sees a video game device as a standard piece of A/V equipment. Why are there different rules for consoles?


DVI to HDMI overscan (screen edge cutoff) on an HDTV

July 3rd, 2009

Update – 4/1/2010: Latest nVidia drivers have overscan correction built in

Well I learned something new recently. I have a friend that’s making the Ubuntu switch and he called me up with a bizarre problem. He’s using an nVidia card (although other cards have the same issue) with a DVI out port to a DVI->HDMI converter to an HDMI input on a 26″ HDTV that he uses as a monitor.

He called me up and described the problem, and I confessed I had never heard of it before. All four sides of his screen were getting cut off: he could only see part of his menu bars at the top and bottom, and the left/right edges were cut off as well. After some Googling, I at least found the name for the problem: overscan.

And once I figured out the name, my Google searches became eye-opening. There are a lot of people out there with overscan problems, and there are very few solutions in Linux. The Windows nVidia drivers allow dynamic overscan correction inside their driver toolbox. The X server nVidia drivers have no such options (for DVI out; for TV out there apparently are some).

The problem, as I understand it, is that the PC is sending a DVI PC style output, but the TV is reading a HDMI TV style input. As such, the TV thinks it’s receiving a TV signal and acts accordingly. If your TV has a DVI input, it should treat that as a PC input and give you 1:1 pixel mapping (which is what you’re looking for). If not, you’ll need to adjust for the overscan on the PC side. Some TVs even have an option to treat an HDMI signal as if it were PC. Check your TV’s manual.

Anyhow, there are a lot of people asking for help with this issue, but it’s very hard to find any actual information.

Option 1 – Manually

I don’t know if this works, but it looks like good info. If you’re looking for a way to fix this (and you’re ready to spend quite a while doing it), you should read this:

Ubuntu Forums: Nvidia, Modelines, Overscan…8.10

Basically it’s trial and error to get the correct Modeline into your X server config. It’s mind-boggling that no one (especially nVidia, which seems to care about Linux a little bit) has put out any definitive information on this topic.

Option 2 – A little less manually

I definitely don’t know if this works. I don’t know if anyone has even tried it. If this works/doesn’t work for you, post in the comments.

You can see if the XFree86 modeline generator will give you something that works. I don’t really understand what all the modeline timings mean, but here’s a shot in the dark (you’re probably desperate at this point anyway, and I have no way of testing this, so I don’t know if it even works at all). Also, I’ll give the same warning everyone gives on this: I take no responsibility at all if this damages your television. Try this at your own risk.

First things first, back up your xorg.conf file (/etc/X11/xorg.conf) somewhere safe (like your home directory).

I wrote a quick program that will help you determine your visible screen size:

Source: findcoords.c (source)
Binary: findcoords (compiled on Ubuntu 9.04)

If the binary doesn’t work for you or you’d prefer to compile from source, you’ll need the libx11 development packages installed (as well as the standard stuff like gcc and whatnot). On Ubuntu, running “sudo apt-get install build-essential libx11-dev” should do the trick. To compile it, run: gcc -o findcoords findcoords.c -lX11

Now run it by typing ./findcoords

It’ll tell you to click the upper-left and bottom-right corners of the screen. Get as close as possible; you want the very point of the cursor as close to the edge as possible. That means in the bottom right, you should only be able to see about one pixel of your cursor. When you’ve done that, it’ll calculate your viewable screen size and output something like this:

Root Window Size: 2880x900
Viewable Size: 2764x798
Your screen is cut off by the following number of pixels:
Left  : 31
Right : 85
Top   : 24
Bottom: 78

Armed with the actual visible screen size, head over to the XFree Modeline Calculator (it works for Xorg too).

1. Enter the values under “Monitor Configuration” if you know them. If not leave that section blank.
2. Under “Basic Configuration” enter the viewable size that got output from findcoords.
3. If you know the max refresh rate for your TV, you can enter it here. If not, just use 60Hz.
4. If you know the dot clock frequency enter it as well, otherwise, just leave it blank.
5. IMPORTANT: If your TV is interlaced at its max resolution (i.e. 1080i), check the interlaced button.
6. Click the “Calculate Modeline” button and it should give you a modeline at the top of the screen.
7. In your xorg.conf file, put the modeline it gives you into the Monitor section
8. Add this line to your Monitor section as well:

Option "ExactModeTimingsDVI" "TRUE"

9. Now, to use this, you’ll need to add this line to your Device section:

Option "UseEDID" "FALSE"

10. Then in the Display subsection, add a line that LOOKS like this, but using the mode name from the modeline the generator gave you:

Modes "1960x1080@60i"

In other words, if the modeline generator spit out:

Modeline "1816x980@60i" 65.89 1816 1848 2096 2128 980 1002 1008 1031 interlace

You would put the following in the Display section:

Modes "1816x980@60i"

That text has to match EXACTLY. When it’s all said and done, you should end up with an xorg.conf that looks something like this:

Section "Monitor"
    Identifier     "Monitor0"
    VendorName     "Unknown"
    ModelName      "DELL S199WFP"
    HorizSync       30.0 - 83.0
    VertRefresh     56.0 - 75.0
    Option         "ExactModeTimingsDVI" "TRUE"
    Modeline "1816x980@60i" 65.89 1816 1848 2096 2128 980 1002 1008 1031 interlace
EndSection

Section "Device"
    Identifier     "Device0"
    Driver         "nvidia"
    VendorName     "NVIDIA Corporation"
    BoardName      "GeForce 9800 GT"
    Option         "UseEDID" "FALSE"
EndSection

Section "Screen"
    Identifier     "Screen0"
    Device         "Device0"
    Monitor        "Monitor0"
    DefaultDepth    24
    SubSection     "Display"
        Depth       24
        Modes       "1816x980@60i"
    EndSubSection
EndSection
Give that a shot and see what you get. Can’t be much worse, can it? If it doesn’t work, just revert back to what you had by replacing your xorg.conf file from the backup. If you get any halfway decent results at all, let me know.

More terms to know:

1-to-1 pixel mapping: If your HDTV (as a monitor) supports this option, chances are this will solve your problem. This means that every pixel sent by the PC will be mapped to a pixel on the screen (i.e. disable overscan).

Full Pixel: This is the same as 1:1 pixel mapping.

Modelines: Definitions of video modes that control the display size in the X server

Overscan: Part of standard TV input where a percentage of the edges of the screen are cut off. Not noticeable for normal TV viewing, but very noticeable on a PC desktop.

EDID: Monitor/TV device information telling the PC what modes are supported (stored in the monitor and not configurable)

Good luck.
