Google Scholar Does Citations

Google Scholar just added a citations feature which, among other things, lets you get a BibTeX-formatted citation.  This is so handy, and I’ve wondered for a while why it wasn’t there already.  Often you can go to the IEEE page or whatever to get the citation, but some journals make you log in (IEEE, why do you do that?), and anyway this saves a click.

 To copy a formatted citation, click on the “Cite” link below a search result and select from the available citation styles (currently MLA, APA, or Chicago) […] You can also use one of the import links to import the citation into BibTeX or another bibliography manager.  We hope that simplifying the chore of citation formatting will let you focus on what you really want to work on: writing a great paper!

Thanks Goog, that is what I’d rather focus on.

Python’s super()

I’ve been writing a lot of Python lately, and there are some things I love about it, some I’m uncomfortable with, but nothing I hate.  The super() builtin makes life a lot easier when you’re writing object-oriented code with some inheritance going on.  This blog post describes the best way I’ve seen it used so far.

First of all, in Python 3 you no longer need to write “super(CurrentClass, self).foo”; “super().foo” generally does the right thing.  This is great, because it’s easier to rename a class, or copy a line of code, without breaking everything.
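
Here’s a tiny sketch of the zero-argument form (the class names are made up for illustration):

    import time

    class Logger:
        def log(self, msg):
            print("base:", msg)

    class TimestampLogger(Logger):
        def log(self, msg):
            # Python 3: no class name to keep in sync if this class gets
            # renamed; Python 2 needed super(TimestampLogger, self).log(...)
            super().log(time.strftime("%H:%M:%S") + " " + msg)

    TimestampLogger().log("hello")  # prints something like: base: 12:34:56 hello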

The blog post also describes a simple way of passing arguments to parent classes.  The classes must cooperate: each method takes keyword arguments, strips out the ones it needs, and forwards the rest up the chain, and callers must name every argument.  I think that’s good practice anyway, especially because once you’re relying on inheritance things are already getting complex, and the extra verbosity helps make your intentions explicit.
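
Here’s a rough sketch of the pattern (my own toy classes, not code from the post): each __init__ peels off the keyword arguments it knows about and forwards the rest up the chain.

    class Shape:
        def __init__(self, shapename, **kwargs):
            self.shapename = shapename
            super().__init__(**kwargs)  # forward the leftovers up the MRO

    class ColoredShape(Shape):
        def __init__(self, color, **kwargs):
            self.color = color
            super().__init__(**kwargs)

    cs = ColoredShape(color="red", shapename="circle")

By the time the chain reaches object.__init__, every keyword should have been consumed; any leftovers raise a TypeError, which catches misspelled argument names early.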

The Linux Graphics Stack – Meta

Jasper St. Pierre explains the Linux graphics stack in detail.

I think X11 is the most complex piece of software I’ve ever tried to work with.  It has been a long time since I’ve had to open up xorg.conf (and before that, XF86Config) to fix something broken, and that’s a testament to the quality of modern versions of Ubuntu and Debian.  But there was a day when I had to open that massive config file quite often to get the video card, mouse, or keyboard working properly.

Maybe five to ten years ago, getting games to work well was a pain.  Whenever the distro packages updated, or the video card driver updated, or I updated the kernel, the house of cards that is the OpenGL stack would come tumbling down.  I remember being confused about what all the libraries (libGL.so…), packages (Mesa), and config files were doing, and how they all fit together.

Well, this blog post puts it all in context.

3D rendering with OpenGL

  1. Your program starts up, using “OpenGL” to draw.
  2. A library, “Mesa”, implements the OpenGL API. It uses card-specific drivers to translate the API into a hardware-specific form. […]
  3. libdrm uses special secret card-specific ioctls to talk to the Linux kernel.
  4. The Linux kernel, having special permissions, can allocate memory on and for the card.
  5. Back out at the Mesa level, Mesa uses DRI2 to talk to Xorg to make sure that buffer flips and window positions, etc. are synchronized.
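
To make step 1 concrete, here’s a minimal sketch in Python, assuming the PyOpenGL package and its GLUT bindings (my choice, not something from the post).  Every gl* call below enters at the top of the stack and is carried down through libGL, libdrm, and the kernel:

    from OpenGL.GL import (GL_COLOR_BUFFER_BIT, GL_TRIANGLES, glBegin,
                           glClear, glEnd, glFlush, glVertex2f)
    from OpenGL.GLUT import (GLUT_RGB, GLUT_SINGLE, glutCreateWindow,
                             glutDisplayFunc, glutInit, glutInitDisplayMode,
                             glutMainLoop)

    def display():
        glClear(GL_COLOR_BUFFER_BIT)  # plain OpenGL API calls (step 1)...
        glBegin(GL_TRIANGLES)         # ...translated by Mesa or a vendor
        glVertex2f(-0.5, -0.5)        # libGL into hardware-specific
        glVertex2f(0.5, -0.5)         # commands (steps 2-4)
        glVertex2f(0.0, 0.5)
        glEnd()
        glFlush()                     # push buffered commands down the stack

    glutInit()
    glutInitDisplayMode(GLUT_SINGLE | GLUT_RGB)
    glutCreateWindow(b"stack demo")   # window setup; DRI2 synchronization
    glutDisplayFunc(display)          # (step 5) happens behind the scenes
    glutMainLoop()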

2D rendering with cairo

  • Your program starts up, using cairo to draw.
  • You draw some circles using a gradient. cairo decomposes the circles into trapezoids, and sends these trapezoids and gradients to the X server using the XRender extension [or] libpixman […].
  • The X server acknowledges the XRender request. Xorg can use multiple specialized drivers to do the drawing.
    1. In a software fallback case […] Xorg will use pixman to do the actual drawing […].
    2. In a hardware-accelerated case, the Xorg driver will speak libdrm to the kernel […].
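
The first two steps are easy to try at home.  Here’s a small sketch using pycairo (the Python binding; my choice, not the post’s).  It targets an in-memory image surface, so the rasterization happens in-process via pixman rather than going through the X server:

    import math
    import cairo

    # Draw a circle filled with a radial gradient, as in the steps above.
    surface = cairo.ImageSurface(cairo.FORMAT_ARGB32, 200, 200)
    ctx = cairo.Context(surface)

    gradient = cairo.RadialGradient(100, 100, 10, 100, 100, 90)
    gradient.add_color_stop_rgb(0.0, 1.0, 1.0, 1.0)  # white center
    gradient.add_color_stop_rgb(1.0, 0.2, 0.4, 0.8)  # blue edge

    ctx.set_source(gradient)
    ctx.arc(100, 100, 80, 0, 2 * math.pi)  # cairo decomposes this into trapezoids
    ctx.fill()

    surface.write_to_png("circle.png")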

The post goes into much more detail about how all the pieces fit together.  To me, the most enlightening part was where Mesa actually fits into all this.  I’ve always been confused about why I need a copy of Mesa when I’ve also got the GL libraries provided by Nvidia or whomever.

One thing I still need enlightening on is why Nvidia keeps its driver source secret.  What kinds of proprietary things would Nvidia be doing in there that are critical to keep secret?  Clearly they want their hardware designs proprietary, but surely the driver source can’t reveal enough about the hardware design to actually help a competitor…