Perusal, Synthesis, Bliss

October 27, 2012: installation of Kubuntu 12.10 (The Quantal Quetzal) on the LDLC computer Saturne SG4-I3-8-S9H7, bought a few days earlier

New features of Kubuntu 12.10

An interesting feature is the “guest” user, without a password, offered in the login manager. Even though this user cannot run “sudo”, or even “su jscordia”, it makes it possible to try KDE, GNOME, or other window managers in a clean $HOME, without stale configuration files.

Download

The Kubuntu image is now about 900 MB, to be used with a USB key rather than a CD.
I notice that the images can now be downloaded directly and through BitTorrent, but also through zsync and jigdo, which I add to my package list to be reinstalled at the next installation. Moreover, openSUSE recommends “aria2” as a download utility, which supports e.g. BitTorrent. I add it as well.
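For instance, zsync can refresh an already-downloaded image by fetching only the blocks that changed; a minimal sketch, assuming the usual cdimage.ubuntu.com layout (the file names are placeholders):
$ zsync -i kubuntu-12.10-beta2-desktop-amd64.iso http://cdimage.ubuntu.com/kubuntu/releases/12.10/release/kubuntu-12.10-desktop-amd64.iso.zsync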
First, I tried to install from the standard AMD64 distribution, because the release notes say that “It also does away with the alternate installer images, adding advanced partitioning options to the [LIVE] desktop image.”. But I have not found these advanced partitioning options in the manual partitioning mode. This seems to be confirmed by this. So I have instead used the NON-LIVE alternate image, which I found through a Google search for “Kubuntu 12.10 alternate”. I did not find this image linked directly on the website; it is more or less hidden.
I have put this image on my 8GB USB key with usb-creator-kde, successfully.

Installation

The motherboard of my previous computer (MSI) died: at startup, no BIOS, and it is not a screen problem. I asked a repair shop at Croix-Rousse: they told me that the only option is to send the computer back to MSI, for a price of around 300€. I cannot afford to wait one month, so I decide to buy a new laptop: an LDLC computer at 900€:
LDLC Saturne SG4-I3-8-S9H7
Built for entertainment applications, this multimedia-oriented LDLC Saturne SG4 laptop features a Blu-ray drive, a 17.3" Full HD matte LED screen and an NVIDIA GeForce GT 650M graphics controller.
I have installed Kubuntu 12.10 on it without any problem, except for the sound delivered by the laptop speakers (see below).
To recover the content of my previous HDD (a 2.5" Western Digital WD3200BEVT (320GB)), the procedure was to buy an external hard drive enclosure and put the HDD of the MSI computer inside it. Due to LVM (Logical Volume Manager) on this disk, mounting does not work directly from KDE in Kubuntu. I first tried two different external HDD enclosures, without any result. Finally I proceeded more manually: (i) use the KDE password popup to enter the passphrase of the LVM, after which Linux creates the suitable entries in /dev/mapper; (ii) type in a terminal, for example for the home directory:
$ mkdir /media/home_hd
$ mount /dev/mapper/vg_vghome /media/home_hd
It works perfectly. However, I ran into problems when exploring /usr/local/share on the external HDD: I/O errors, etc., and very strange sounds from the enclosure. I thought the HDD was dead, but after unmounting the corresponding partition and typing:
$ fsck.ext3 -y /dev/dm-9
it did not report any error and finished successfully. After that, no problem accessing /usr/local/share and copying /usr/local/* to the SSD of my new computer.
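For reference, the fully manual equivalent of the KDE popup route is roughly the following (a sketch: /dev/sdc5 is an assumption, and the volume name matches my /dev/mapper entry):
$ sudo cryptsetup luksOpen /dev/sdc5 msi_crypt   # unlock the LUKS container of the old disk
$ sudo vgchange -ay                              # activate the LVM volume groups found inside
$ sudo mkdir -p /media/home_hd
$ sudo mount /dev/mapper/vg_vghome /media/home_hd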

Disappearance of the alternate CD containing the text installer

One week later, it is now impossible to find the alternate CD; it is no longer available here. See here: the text installer is no longer available in the “augmented” CD version (about 900 MB). I NEED the text installer so as to keep my encrypted LUKS/LVM system with a separate home partition, following the tutorial at this location.
I have tried the installer of Ubuntu 12.10, named Ubiquity: it is not able to recognize LVM partitions, exactly like the KDE graphical installer, but contrary to the openSUSE graphical partitioner and the alternate Ubuntu text installer (update: see however the notes corresponding to my May 1, 2013 installation of Kubuntu 13.04).
It seems that the text installer is available in the DVD version here:
Just start using a DVD already. Have the features that existed on the alternate LiveCD exist on the new LiveDVD and offer the alternate installer as an option in grub. Anyone who has a computer that’s strong enough to run Ubuntu must have a DVD drive by now.
Maybe this DVD is the one available here? But I have tried it: no trace of this text installer!
For the time being, the only workarounds seem to be here:
If there is no workaround, I will be compelled to switch to openSUSE!
In the Kubuntu 13.04 alpha2 release notes, there is perhaps an interesting piece of information here, in the “known problems” section:
The desktop image installer cannot unlock existing encrypted (LUKS) volumes. If you need to make use of existing encrypted volumes during partitioning, then use the "Try Ubuntu without installing" boot option to start a live session, open the encrypted volumes (for example, by clicking on their icons in the Unity launcher), enter your password when prompted to unlock them, close them again, and run ubiquity to start the installer. (1066480)
referring to bug 1066480.

Formatting the previous MSI hard drive: installing several distributions on the same USB key

At the end of the installation, I copied /home and /usr/local to the drives of my new computer (the former to the HDD, the latter to the SSD). Now I want to turn the old HDD into an external hard drive for backup purposes. So I need to format it and encrypt it. I decide to remove the Windows Vista partitions from it.
I first used the KDE “Partition Manager” to remove all partitions present on this disk, with success. However, it is not able to create an encrypted partition, and gparted is not able to do that either. It is always possible to use the openSUSE live CD: to the best of my knowledge, its partitioner is the only GUI capable of that. So I decide to use openSUSE; and to avoid erasing the 12.10 alternate image on my 8GB USB key, which I have just installed, it would be interesting to put several Linux distributions on the same USB key. There are several sources of information on how to do that, but the best option seems to be multiboot-usb: see here and here. I have downloaded a .deb proposed at the second address and added it to /usr/local/packages/installed. I install it, and now I can run the tool with:
$ multibootusb.gambas
quantal-alternate-amd64.iso is not supported: I obtain a popup error. But openSUSE-12.2-KDE-LiveCD-x86_64.iso is accepted for installation on the USB key. However, the USB key does not boot; I obtain an error: “no DEFAULT or UI configuration directive found”. I have tried both with and without installing syslinux (second tab of multibootusb). So, no way to make multibootusb work, even after some e-mail exchanges with the author. Temporarily, I have thus decided to install only openSUSE on the USB stick, so as to format and encrypt my external hard drive. But I made other attempts later: see the section “Retrying to have...” below.

Installing openSUSE on a USB key

After that, I tried to install openSUSE-12.2-KDE-LiveCD-x86_64.iso on the USB key with usb-creator-kde: I select the .iso, but it does not appear in the QTableWidget. Then I try unetbootin: it installs onto the key without any error, but I obtain an error at boot: “could not find kernel image: gfxboot”.
So, I decide to try the same with openSUSE 12.1: exactly the same behavior with the three tools: multibootusb, usb-creator-kde, and unetbootin.
Then, I try the unetbootin feature that downloads a distribution and puts it on a USB key: I try openSUSE Factory_x64, i.e. the development release. But it does not download a live version, only a minimal image that installs onto the hard drive by downloading all the packages from the internet.
Finally, I look on the openSUSE wiki for instructions specific to it: here. I apply the instructions to create a USB stick from the console. openSUSE recommends “hwinfo”, which I add to my package list.
$ umount /dev/sdc1
$ dd if=/home/jscordia/openSUSE-12.2-KDE-LiveCD-x86_64.iso of=/dev/sdc1
Note that on the openSUSE French page here, it is advised to use dd_rescue rather than dd:
$ ddrescue /home/jscordia/openSUSE-12.2-KDE-LiveCD-x86_64.iso /dev/sdc1
The command is named “ddrescue” on Ubuntu, not “dd_rescue” as on SUSE, but I obtain an error:
ddrescue: Output file exists and is not a regular file.
ddrescue: Use ’--force’ if you really want to overwrite it, but be
aware that all existing data in the output file will be lost.
Try ’ddrescue --help’ for more information.
I did not pursue this further.
After a reboot, it does not work: the USB key does not boot, without any error message. So I try what is advised on the openSUSE website:
This situation would happen very rarely, but in the even that your computer doesn’t boot from the LiveUSB/DVD from the steps above, you might try the following procedure.
Open a console and do the following as root:
After a reboot, same problem. I see a very brief message saying something like “isolinux.bin missing or corrupt”. Now I try with the 12.1 version: same problem. I finally found the solution, starting from the error message, here: what must be done is to change /dev/sdc1 into /dev/sdc:
$ dd if=/home/jscordia/openSUSE-12.2-KDE-LiveCD-x86_64.iso of=/dev/sdc
The author also advises using “sync” and “eject”, but I have not tried them; I only changed /dev/sdc1 into /dev/sdc, so that the MBR is written! It is obvious, in hindsight. I create an encrypted partition on my external hard drive without any problem, and it is perfectly recognized as such by my Kubuntu 12.10.
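For completeness, the full sequence with the recommended “sync” and “eject” steps (which I did not try) would presumably be:
$ sudo dd if=openSUSE-12.2-KDE-LiveCD-x86_64.iso of=/dev/sdc bs=4M
$ sync            # flush pending writes before removing the key
$ sudo eject /dev/sdc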
Note that using /dev/sdc1 instead of /dev/sdc is perhaps the problem in unetbootin: it shows /dev/sdc1 as the device path of my USB key, and it is not possible to change it. multibootusb, however, correctly shows /dev/sdc in its GUI.
Now, when using the “dd” method given above, two partitions appear in KDE, and it is impossible to mount them at the same time: either one or the other can be mounted. In gparted, by contrast, only one partition is visible. I have tried:
This did not change anything: two partitions were still seen by KDE. I finally did:
$ sudo dd if=/dev/zero of=/dev/sdc count=1MB
1000000+0 records in
1000000+0 records out
512000000 bytes (512 MB) copied, 56.9837 s, 9.0 MB/s
After that, I created a FAT32 partition in gparted, and it was OK: only one partition was seen by KDE.
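For reference, the command-line equivalent of these gparted steps would be something like (a sketch; gparted is what I actually used):
$ sudo parted --script /dev/sdc mklabel msdos
$ sudo parted --script /dev/sdc mkpart primary fat32 1MiB 100%
$ sudo mkfs.vfat -F 32 /dev/sdc1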

Retrying to have several distributions on the same USB key

Having at least Kubuntu and openSUSE (for its partitioning tool) on the same USB key is really needed. One month later, I have tried several more times to make multibootusb work: no way. Is there an alternative? There is a lot of software for this on Windows (here), but not on Linux; I have only found the following alternative: here. The installation method is given here. I have downloaded the latest version of multisystem directly here and of gtkdialog here, without setting up a repository.
Then I have formatted my USB key with gparted.
When I try to run “multisystem”, I obtain:
Gtkdialog version: 0.8.2
LANG:en_US.UTF-8
LANGUAGE:en_US.UTF-8
LANGSEL:English|en|en_US.UTF-8|Ryan J Nicholson|rjn256@gmail.com
 Error: Missing: 1:gettext 2:gettext.sh 3:udevadm 4:gtkdialog 5:zenity 6:update-grub2 7:syslinux 8:xterm 9:xdotool 10:wmctrl 11:kvm 12:sudo 13:gksudo 14:genisoimage 15:blkid 16:nohup 17:xdg-desktop-menu 18:xdg-open 19:identify 20:convert 21:mkfs.ext3 22:mkfs.ext2 23:mlabel 24:md5sum 25:parted 26:mkdosfs 27:tune2fs 28:cryptsetup 29:rsync 30:sudo 31:lzma 32:unlzma 33:hdparm 34:gzip 35:unzip 36:fatresize 37:xxd 38:mksquashfs 39:pgrep 40:xz 41:df 42:gvfs-mount
and I am compelled to do:
$ killall gtkdialog
The problem is in multisystem itself; indeed, the string "Error: Missing: " appears in /usr/local/share/multisystem/locale/en/LC_MESSAGES/multisystem.en.po:
#, sh-format
msgid "Erreur il manque: "
msgstr "Error: Missing: "
This is a gettext format for multilingual purpose. This string is used in:
/usr/local/share/multisystem/install.sh
/usr/local/share/multisystem/gui_multisystem.sh
I correct the problem by replacing, in /usr/local/share/multisystem/gui_multisystem.sh, the line
for i in $(grep -v "^#" <<<"${testlist}" | xargs)
by
for i in $testlist
Then it starts (but be careful: kill all processes related to multisystem before running it).
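Presumably the loop simply walks the list of required external tools and reports any that are absent; a minimal sketch of the idea (shortened list, for illustration):
$ testlist="gettext zenity parted"
$ for i in $testlist; do command -v "$i" >/dev/null || echo "missing: $i"; done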
Sometimes it halts during startup: in that case, run it as root; it displays “no root”, accept the error, and then rerun it as a normal user: it should start correctly. But after that there are a lot of bugs: I decide to drop it (kept in /usr/local/src/packages/old).
However, it seems to have worked for some people (here), but there is no way for me. Definitely, to my mind, a large program written in bash is not a good idea!
Thus I try the YUMI executable on Windows, in VirtualBox, but I have not managed to get my USB key detected in VirtualBox. After installing an Oracle extension (here, kept in ~/local/non_src), I obtain “unknown peripheral”. It should however be possible according to here, but I prefer to wait for better multiboot USB support on Linux (it will surely come one day).
Update on August 30, 2013: I finally manage to use MultiSystem: I simply install it on a USB key, like any Linux distribution. To install other Linux distributions, boot MultiSystem itself, download a distribution from the internet, and install it on the currently connected USB key, i.e. the key on which MultiSystem is installed; at the next boot, the choice between MultiSystem and the other installed distributions is offered.
Update on October 23, 2015: to create the MultiSystem USB key, I used “usb-creator-gtk”, which creates a FAT32 partition when choosing to erase the USB disk; it then installs the latest ISO found here on the key (today: ms_lts_precise_r9.iso).

Sound

The sound does not work properly through the laptop speakers, just as with my previous MSI, but it does work with my own external speakers, so for the time being I set this problem aside.

Reading a Blu-ray disc

I bought a Blu-ray disc and installed the various packages necessary to read an encrypted commercial DVD on Ubuntu, but VLC refuses to play it. After some investigation, I found that it is not even able to read a DVD. To investigate the problem, I did:
$ dmesg
[...]
Media region code is mismatched to logical unit region.
[...]
Indeed, commercial DVDs have a “region” field, and the DVD drive must match this field.
Looking at forums, it seems that this problem should not appear: libdvdcss is normally able to read any DVD. Indeed, in the past I read “Taken” by Spielberg without problems, which is a North America region DVD. After more searching, it appears there is a “regionset” utility on Ubuntu, which makes it possible to query and change the region of a DVD player: only five changes are allowed over the lifetime of the drive. In my case, the Blu-ray drive has no region associated with it (no number is given by “regionset”, exactly like my external DVD player, which is able to read any DVD). So, I tried (with a DVD inside the Blu-ray drive):
$ regionset /dev/sr0
and entered “2” as the region in the interactive process. After that, the Blu-ray drive was able to read DVDs, but not “Taken”. So maybe it was not a good idea after all: perhaps manufacturers of computer drives leave the region unset in the controller, so that people can read DVDs from any region. By changing it to “2”, I am now able to read commercial DVDs, but since it seems impossible to change the region back to “unset”, I will probably never be able to read “Taken” with this region setting. This is a pity if the problem was only a bug in libdvdcss. I do not install regionset by default in Sycomore, because it is a dangerous operation.
Note that after changing the region, I was compelled to remove ~/.dvdcss, which contained decryption information pertaining to the previous “regionset” configuration. Without removing this directory, the DVD played, but the image was garbled.
Update on April 15, 2013: the problem is probably related to the fact that I have a Matshita drive: here. There is a technique to make a Matshita player region-free: here. On this page, I read something that corresponds to my situation: “Apple ships Macbooks with drives without a region set. However, before you watch the movie, you have to set a region. Later you can change it but not more then four times.”. See also here, here, and here for the list of available firmwares. I have saved the page at /home/jscordia/Documents/documentation/work/computer/hardware/Matshita_dvdreader/ to compare with future versions where my DVD player may appear:
$ sudo lshw -C disk
*-cdrom
description: DVD-RAM writer
product: BD-CMB UJ141AF
vendor: MATSHITA
physical id: 0.0.0
bus info: scsi@2:0.0.0
logical name: /dev/cdrom
logical name: /dev/cdrw
logical name: /dev/dvd
logical name: /dev/dvdrw
logical name: /dev/sr0
logical name: /media/TAKEN
version: 1.00
capabilities: removable audio cd-r cd-rw dvd dvd-r dvd-ram
configuration: ansiversion=5 mount.fstype=udf mount.options=ro,nosuid,nodev,relatime,uid=1000,gid=1000,umask=77,dmode=500,iocharset=utf8 state=mounted status=ready

Java

I have removed the package “sun-java6-jre” from the Sycomore package list; I have put “default-jre” instead.

Graphics card

The display is correct after the initial installation, without the NVIDIA drivers but with the “Nouveau” driver, except that the image is a bit garbled at the top of the screen, on the desktop, below the windows:
kubuntu12.10.garbled.png
The problem disappears when desktop effects are disabled.
With glxgears, we obtain:
$ glxgears 
Running synchronized to the vertical refresh. The framerate should be
approximately the same as the monitor refresh rate.
296 frames in 5.0 seconds = 59.148 FPS
300 frames in 5.0 seconds = 59.826 FPS
297 frames in 5.0 seconds = 59.228 FPS
Note that on the laptop, the LED corresponding to the NVIDIA card is on, not the one corresponding to the integrated Intel GPU (see the laptop manual). I want better NVIDIA support (e.g. for CUDA computations), so my first thought was to install the NVIDIA driver.

Various tests that did not work

I tried to install “nvidia-current” and restart the X server; the “nvidia-settings” command then gives a message indicating that the NVidia driver does not seem to be in use, and asks to run “nvidia-xconfig” as root (this behavior will be explained below). I do that and obtain an xorg.conf file in /etc/X11. However, after a logout, the resolution is very bad, and glxgears returns an error. Deleting xorg.conf allows going back. After that, I tried removing the nouveau driver (“xserver-xorg-video-nouveau”) and creating xorg.conf again with nvidia-xconfig: same problem.
So I tried to install the NVidia drivers from the NVidia website: uninstall “nvidia-current”, then follow the procedure already used for the NVidia drivers on my work Dell laptop (see earlier notes): reboot Linux in recovery mode, open a root shell prompt (note that I had to wait a long time before the recovery menu appeared, unlike on the other computers I have tried), and type:
# telinit 3 is not needed, just as in my previous installation on the work Dell laptop
$ mount /usr/local
$ cd /usr/local
$ mount -o remount,rw /
$ ./NVIDIA....run # archive name
Then reboot, but it did not work any better: low resolution, and nvidia-settings reporting that the NVidia card is not in use. To uninstall, I typed:
$ nvidia-uninstall
But after a reboot, no more X. Reinstalling “xserver-xorg-video-nouveau” and rebooting: same problem. In fact, “startx” works, but it starts a strange interface with few features (maybe a failsafe version of Unity?). However, the file /etc/X11/default-display-manager is correct: it contains the string “/usr/sbin/lightdm”. Anyway, “$ start lightdm” does not work; it yields errors. So I decide to install “nvidia-experimental-304”. Now it works as after the initial Kubuntu installation, and lightdm runs correctly, but if I create an xorg.conf file with “nvidia-xconfig”, I have the same problem as before, with the additional issue that glxgears does not work.

Final bumblebee installation

Finally, I found a post here by ArchangeGabriel explaining how the Linux implementation of the NVidia Optimus technology works: this is the Bumblebee project, here. The author is clear:
So by default, under Linux, both graphics cards are permanently powered on and drawing power, while only the integrated one is used. This hurts battery life, since the graphics card can account for up to 50% of the computer's power consumption. On top of that, it is impossible to use the dedicated graphics card: installing the proprietary drivers (developed by nVidia) results at best in no change, at worst in a system that cannot boot correctly. So you must absolutely not install them manually; let Bumblebee take care of it.
The problem would not exist if I had a processor without an integrated GPU: it seems that some i7 processors are delivered without an integrated GPU, but certainly not all, because:
A very interesting FAQ is appended to the article by ArchangeGabriel mentioned above. The fact that it is necessary to use “$ optirun PROG” to run a program on the NVidia card is not a Linux restriction; it works exactly the same way on Windows:
Isn't it inconvenient to have to launch all 3D applications from the terminal? Can't we have an automatic mode?
This is in fact exactly what the nVidia drivers do on Windows: they compare the launched applications against a list, and if an application appears in the list, then it must run on the nVidia card. We are also working on an implementation of this system. It is part of the bumblebee-ui project.
Besides, you can edit the command of your desktop/menu/launcher shortcuts and prepend optirun to the existing content.
The English version of the FAQ, here, phrases it approximately as:
Why do I need to launch applications using optirun in a terminal? Isn’t there an easier way to do so?
In fact there is a way to launch applications without a terminal. You need to edit your .desktop file of that application. We deliberately don’t show you how to do it because if you know, you don’t need any more information. If you do it wrong you may break something or cause some annoyances. There is a work in progress on this matter involving bumblebee-ui for convenience and stability.
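As an illustration of the shortcut editing mentioned in both FAQs (a sketch; “mygame.desktop” is a hypothetical launcher), one can copy the .desktop file into the per-user directory and prepend optirun to its Exec line:
$ cp /usr/share/applications/mygame.desktop ~/.local/share/applications/
$ sed -i 's|^Exec=|Exec=optirun |' ~/.local/share/applications/mygame.desktop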
The error messages I obtained from “nvidia-settings”, saying that the driver is not in use, can now be understood:
The system says the nvidia driver is installed but not activated
This is normal, because it is the truth. Bumblebee uses the nvidia driver and has therefore installed it. However, this driver is only loaded when it is used, that is, when optirun is running, because it is not the driver responsible for the display. So there is nothing alarming.
I cannot access the nVidia Control Panel (nvidia-settings) (or it tells me that I am not using the nvidia card). What should I do?
First of all, if you are advised to use nvidia-xconfig, do not do it! Come and ask first on this topic whether it is appropriate.
In fact, the nVidia Control Panel [nvidia-settings] only detects your card if the xorg.conf in use is the one for this card and if, moreover, the X.org server on which the Panel runs is driven by this card. Because of these conditions, it is not possible to use the nVidia Control Panel as-is for the moment; we are considering the various options available to work around this problem.
In the meantime, you can still access it by using the following command:
$ optirun nvidia-settings -c :8
(Advanced users: if you have changed the value of VGL_DISPLAY, you must change it here too.)
So we also understand why no xorg.conf should be present. Note that I do not need nvidia-settings, at least for multi-display configuration, because the KDE display settings work perfectly.
Note that it is not easy to use only the NVidia card:
Can't we use the nVidia card exclusively, and turn off the Intel card?
No, this is not possible, for a simple reason: it is physically impossible. The nVidia card and the Intel card are physically wired together, and only the Intel card is connected to the screen. The display therefore necessarily goes through the Intel card, so it is not possible to disable it. At best, in a distant future, it may become possible to run everything on the nVidia card and use the Intel card only for driving the screen.
The author of this post also says that some BIOSes allow disabling either the integrated GPU or the dedicated one. But not mine.
Finally, I found a working repository for Bumblebee, following the procedure given on the Bumblebee website:
$ sudo add-apt-repository ppa:bumblebee/stable
$ sudo apt-get update
$ sudo apt-get install bumblebee bumblebee-nvidia
Even though it works (see below), I do not add it to my Sycomore script (just after Medibuntu, around line 420), because it will surely be integrated into future Ubuntu distributions. So, for the time being, I have commented out the corresponding lines.
After a reboot, it works perfectly: instead of the laptop's NVIDIA LED, the integrated GPU LED is now on. glxgears works perfectly:
$ glxgears 
Running synchronized to the vertical refresh.  The framerate should be
approximately the same as the monitor refresh rate.
296 frames in 5.0 seconds = 59.063 FPS
302 frames in 5.0 seconds = 60.262 FPS
298 frames in 5.0 seconds = 59.468 FPS
And now, if I use the NVidia card:
$ optirun glxgears 
3427 frames in 5.0 seconds = 685.399 FPS
3572 frames in 5.0 seconds = 714.297 FPS
3563 frames in 5.0 seconds = 712.447 FPS
3576 frames in 5.0 seconds = 715.082 FPS
3399 frames in 5.0 seconds = 679.621 FPS
During this run encapsulated by optirun, the LED corresponding to the NVIDIA card is on, and the integrated GPU LED is off. On CTRL+C, they swap states, going back to the initial configuration. Now, a question about glxgears: why is the framerate limited to the monitor refresh rate in the first case, but not in the second? I remember the openSUSE message prepended to any glxgears output, saying that glxgears is not a meaningful benchmark (confirmed by this). Indeed, in the past I obtained higher figures with my MSI ATI GPU: 32691 frames in 5.0 seconds (see the note of October 31, 2009). It is improbable that the ATI card is more powerful; in fact it is surely far below the performance of the GT 650M.
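The likely explanation is that the Mesa/Intel path synchronizes buffer swaps to the vertical refresh by default, whereas the VirtualGL path used by optirun does not (see the missing swap-control extensions reported by glmark2 below). To confirm that the Intel figures are vsync-capped, the Mesa drivers honor the vblank_mode environment variable (a quick check only, not a setting to keep):
$ vblank_mode=0 glxgears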
Note that the glxgears performance is similar to what another user reports at the above address, here:
glxgears = 60.230 FPS
optirun64 glxgears = 324 or 296 FPS
Now, ArchangeGabriel advises using glxspheres instead:
$ glxspheres
Polygons in scene: 62464
Visual ID of window: 0xd2
Context is Direct
OpenGL Renderer: Mesa DRI Intel(R) Ivybridge Mobile 
56.232539 frames/sec - 62.755514 Mpixels/sec
60.240235 frames/sec - 67.228102 Mpixels/sec
$ optirun glxspheres
Polygons in scene: 62464
Visual ID of window: 0x21
Context is Direct
OpenGL Renderer: GeForce GT 650M/PCIe/SSE2
82.194175 frames/sec - 91.728699 Mpixels/sec
90.052305 frames/sec - 100.498372 Mpixels/sec
So the performance is better with the NVidia card. The integrated GPU is not so bad on this test; and it is true that this integrated GPU allows playing recent games such as Skyrim.
Note that the renderer is reported to be “Mesa DRI Intel(R) Ivybridge Mobile”, which is consistent with the following package description:
$ wajig describe -v libgl1-mesa-glx:amd64
libgl1-mesa-glx: free implementation of the OpenGL API — GLX runtime
Mesa is a 3-D graphics library with an API which is very similar to that of OpenGL.  To the extent that Mesa utilizes the OpenGL command syntax or state machine, it is being used with authorization from Silicon Graphics, Inc.  However, the author makes no claim that Mesa is in any way a compatible replacement for OpenGL or associated with Silicon Graphics, Inc.
This version of Mesa provides GLX and DRI capabilities: it is capable of both direct and indirect rendering.  For direct rendering, it can use DRI modules from the libgl1-mesa-dri package to accelerate drawing.
This package does not include the modules themselves: these can be found in the libgl1-mesa-dri package.
There is a tool available in Ubuntu: glmark2. On the integrated GPU:
$ glmark2
=======================================================
    glmark2 2012.08
=======================================================
    OpenGL Information
    GL_VENDOR:     Intel Open Source Technology Center
    GL_RENDERER:   Mesa DRI Intel(R) Ivybridge Mobile 
    GL_VERSION:    3.0 Mesa 9.0
=======================================================
[build] use-vbo=false: FPS: 1120 FrameTime: 0.893 ms
[build] use-vbo=true: FPS: 1186 FrameTime: 0.843 ms
[texture] texture-filter=nearest: FPS: 1213 FrameTime: 0.824 ms
[texture] texture-filter=linear: FPS: 1187 FrameTime: 0.842 ms
[texture] texture-filter=mipmap: FPS: 1160 FrameTime: 0.862 ms
[shading] shading=gouraud: FPS: 984 FrameTime: 1.016 ms
[shading] shading=blinn-phong-inf: FPS: 1210 FrameTime: 0.826 ms
[shading] shading=phong: FPS: 1194 FrameTime: 0.838 ms
[bump] bump-render=high-poly: FPS: 801 FrameTime: 1.248 ms
[bump] bump-render=normals: FPS: 1201 FrameTime: 0.833 ms
[bump] bump-render=height: FPS: 1171 FrameTime: 0.854 ms
[effect2d] kernel=0,1,0;1,-4,1;0,1,0;: FPS: 812 FrameTime: 1.232 ms
[effect2d] kernel=1,1,1,1,1;1,1,1,1,1;1,1,1,1,1;: FPS: 343 FrameTime: 2.915 ms
[pulsar] light=false:quads=5:texture=false: FPS: 1056 FrameTime: 0.947 ms
[desktop] blur-radius=5:effect=blur:passes=1:separable=true:windows=4: FPS: 323 FrameTime: 3.096 ms
[desktop] effect=shadow:windows=4: FPS: 570 FrameTime: 1.754 ms
[buffer] columns=200:interleave=false:update-dispersion=0.9:update-fraction=0.5:update-method=map: FPS: 414 FrameTime: 2.415 ms
[buffer] columns=200:interleave=false:update-dispersion=0.9:update-fraction=0.5:update-method=subdata: FPS: 375 FrameTime: 2.667 ms
[buffer] columns=200:interleave=true:update-dispersion=0.9:update-fraction=0.5:update-method=map: FPS: 431 FrameTime: 2.320 ms
[ideas] speed=duration: FPS: 889 FrameTime: 1.125 ms
[jellyfish] <default>: FPS: 650 FrameTime: 1.538 ms
[terrain] <default>: FPS: 89 FrameTime: 11.236 ms
[conditionals] fragment-steps=0:vertex-steps=0: FPS: 1157 FrameTime: 0.864 ms
[conditionals] fragment-steps=5:vertex-steps=0: FPS: 1162 FrameTime: 0.861 ms
[conditionals] fragment-steps=0:vertex-steps=5: FPS: 1163 FrameTime: 0.860 ms
[function] fragment-complexity=low:fragment-steps=5: FPS: 1161 FrameTime: 0.861 ms
[function] fragment-complexity=medium:fragment-steps=5: FPS: 1160 FrameTime: 0.862 ms
[loop] fragment-loop=false:fragment-steps=5:vertex-steps=5: FPS: 1162 FrameTime: 0.861 ms
[loop] fragment-steps=5:fragment-uniform=false:vertex-steps=5: FPS: 1166 FrameTime: 0.858 ms
[loop] fragment-steps=5:fragment-uniform=true:vertex-steps=5: FPS: 1169 FrameTime: 0.855 ms
=======================================================
                                  glmark2 Score: 922 
=======================================================
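As an aside, the per-test FPS figures can be averaged with a quick one-liner, to compare with the reported score (a sketch that assumes the “FPS: n” output format above):
$ glmark2 | awk '{for(i=1;i<NF;i++) if($i=="FPS:"){s+=$(i+1);n++}} END{printf "mean FPS: %.0f\n", s/n}'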

I have not actually done the computation, but it seems that the score is simply the average of the FPS over the various tests. Now, if I try the NVidia card:
$ optirun glmark2
* GLX does not support GLX_EXT_swap_control or GLX_MESA_swap_control!
** Failed to set swap interval. Results may be bounded above by refresh rate.
=======================================================
    glmark2 2012.08
=======================================================
    OpenGL Information
    GL_VENDOR:     NVIDIA Corporation
    GL_RENDERER:   GeForce GT 650M/PCIe/SSE2
    GL_VERSION:    4.2.0 NVIDIA 304.43
=======================================================
** GLX does not support GLX_EXT_swap_control or GLX_MESA_swap_control!
** Failed to set swap interval. Results may be bounded above by refresh rate.
[build] use-vbo=false: FPS: 206 FrameTime: 4.854 ms
** GLX does not support GLX_EXT_swap_control or GLX_MESA_swap_control!
** Failed to set swap interval. Results may be bounded above by refresh rate.
[build] use-vbo=true: FPS: 238 FrameTime: 4.202 ms
** GLX does not support GLX_EXT_swap_control or GLX_MESA_swap_control!
** Failed to set swap interval. Results may be bounded above by refresh rate.
[texture] texture-filter=nearest: FPS: 238 FrameTime: 4.202 ms
** GLX does not support GLX_EXT_swap_control or GLX_MESA_swap_control!
** Failed to set swap interval. Results may be bounded above by refresh rate.
[texture] texture-filter=linear: FPS: 238 FrameTime: 4.202 ms
** GLX does not support GLX_EXT_swap_control or GLX_MESA_swap_control!
** Failed to set swap interval. Results may be bounded above by refresh rate.
[texture] texture-filter=mipmap: FPS: 237 FrameTime: 4.219 ms
** GLX does not support GLX_EXT_swap_control or GLX_MESA_swap_control!
** Failed to set swap interval. Results may be bounded above by refresh rate.
[shading] shading=gouraud: FPS: 237 FrameTime: 4.219 ms
** GLX does not support GLX_EXT_swap_control or GLX_MESA_swap_control!
** Failed to set swap interval. Results may be bounded above by refresh rate.
[shading] shading=blinn-phong-inf: FPS: 238 FrameTime: 4.202 ms
** GLX does not support GLX_EXT_swap_control or GLX_MESA_swap_control!
** Failed to set swap interval. Results may be bounded above by refresh rate.
[shading] shading=phong: FPS: 238 FrameTime: 4.202 ms
** GLX does not support GLX_EXT_swap_control or GLX_MESA_swap_control!
** Failed to set swap interval. Results may be bounded above by refresh rate.
[bump] bump-render=high-poly: FPS: 237 FrameTime: 4.219 ms
** GLX does not support GLX_EXT_swap_control or GLX_MESA_swap_control!
** Failed to set swap interval. Results may be bounded above by refresh rate.
[bump] bump-render=normals: FPS: 238 FrameTime: 4.202 ms
** GLX does not support GLX_EXT_swap_control or GLX_MESA_swap_control!
** Failed to set swap interval. Results may be bounded above by refresh rate.
[bump] bump-render=height: FPS: 236 FrameTime: 4.237 ms
** GLX does not support GLX_EXT_swap_control or GLX_MESA_swap_control!
** Failed to set swap interval. Results may be bounded above by refresh rate.
[effect2d] kernel=0,1,0;1,-4,1;0,1,0;: FPS: 229 FrameTime: 4.367 ms
=======================================================
                                  glmark2 Score: 234 
=======================================================
So, we have a problem: the FPS figures are very low. And indeed, these extensions are missing from the optirun stack, whereas GLX_MESA_swap_control is present with the integrated GPU:
$ glxinfo
name of display: :0
display: :0  screen: 0
direct rendering: Yes
server glx vendor string: SGI
server glx version string: 1.4
server glx extensions:
    GLX_ARB_create_context, GLX_ARB_create_context_profile, 
    GLX_ARB_multisample, GLX_EXT_create_context_es2_profile, 
    GLX_EXT_import_context, GLX_EXT_texture_from_pixmap, GLX_EXT_visual_info, 
    GLX_EXT_visual_rating, GLX_MESA_copy_sub_buffer, GLX_OML_swap_method, 
    GLX_SGI_swap_control, GLX_SGIS_multisample, GLX_SGIX_fbconfig, 
    GLX_SGIX_pbuffer, GLX_SGIX_visual_select_group, GLX_INTEL_swap_event
client glx vendor string: Mesa Project and SGI
client glx version string: 1.4
client glx extensions:
    GLX_ARB_create_context, GLX_ARB_create_context_profile, 
    GLX_ARB_create_context_robustness, GLX_ARB_get_proc_address, 
    GLX_ARB_multisample, GLX_EXT_import_context, GLX_EXT_visual_info, 
    GLX_EXT_visual_rating, GLX_EXT_framebuffer_sRGB, 
    GLX_EXT_create_context_es2_profile, GLX_MESA_copy_sub_buffer, 
    GLX_MESA_multithread_makecurrent, GLX_MESA_swap_control, 
    GLX_OML_swap_method, GLX_OML_sync_control, GLX_SGI_make_current_read, 
    GLX_SGI_swap_control, GLX_SGI_video_sync, GLX_SGIS_multisample, 
    GLX_SGIX_fbconfig, GLX_SGIX_pbuffer, GLX_SGIX_visual_select_group, 
    GLX_EXT_texture_from_pixmap, GLX_INTEL_swap_event
GLX version: 1.4
GLX extensions:
    GLX_ARB_create_context, GLX_ARB_create_context_profile, 
    GLX_ARB_get_proc_address, GLX_ARB_multisample, GLX_EXT_import_context, 
    GLX_EXT_visual_info, GLX_EXT_visual_rating, 
    GLX_EXT_create_context_es2_profile, GLX_MESA_copy_sub_buffer, 
    GLX_MESA_multithread_makecurrent, GLX_MESA_swap_control, 
    GLX_OML_swap_method, GLX_OML_sync_control, GLX_SGI_make_current_read, 
    GLX_SGI_swap_control, GLX_SGI_video_sync, GLX_SGIS_multisample, 
    GLX_SGIX_fbconfig, GLX_SGIX_pbuffer, GLX_SGIX_visual_select_group, 
    GLX_EXT_texture_from_pixmap
OpenGL vendor string: Intel Open Source Technology Center
OpenGL renderer string: Mesa DRI Intel(R) Ivybridge Mobile 
OpenGL version string: 3.0 Mesa 9.0
OpenGL shading language version string: 1.30
OpenGL extensions:
    GL_ARB_multisample, GL_EXT_abgr, GL_EXT_bgra, GL_EXT_blend_color, 
    GL_EXT_blend_minmax, GL_EXT_blend_subtract, GL_EXT_copy_texture, 
    GL_EXT_polygon_offset, GL_EXT_subtexture, GL_EXT_texture_object, 
    GL_EXT_vertex_array, GL_EXT_compiled_vertex_array, GL_EXT_texture, 
    GL_EXT_texture3D, GL_IBM_rasterpos_clip, GL_ARB_point_parameters, 
    GL_EXT_draw_range_elements, GL_EXT_packed_pixels, GL_EXT_point_parameters, 
    GL_EXT_rescale_normal, GL_EXT_separate_specular_color, 
    GL_EXT_texture_edge_clamp, GL_SGIS_generate_mipmap, 
    GL_SGIS_texture_border_clamp, GL_SGIS_texture_edge_clamp, 
    GL_SGIS_texture_lod, GL_ARB_framebuffer_sRGB, GL_ARB_multitexture, 
    GL_EXT_framebuffer_sRGB, GL_IBM_multimode_draw_arrays, 
    GL_IBM_texture_mirrored_repeat, GL_3DFX_texture_compression_FXT1, 
    GL_ARB_texture_cube_map, GL_ARB_texture_env_add, GL_ARB_transpose_matrix, 
    GL_EXT_blend_func_separate, GL_EXT_fog_coord, GL_EXT_multi_draw_arrays, 
    GL_EXT_secondary_color, GL_EXT_texture_env_add, 
    GL_EXT_texture_filter_anisotropic, GL_EXT_texture_lod_bias, 
    GL_INGR_blend_func_separate, GL_NV_blend_square, GL_NV_light_max_exponent, 
    GL_NV_texgen_reflection, GL_NV_texture_env_combine4, GL_S3_s3tc, 
    GL_SUN_multi_draw_arrays, GL_ARB_texture_border_clamp, 
    GL_ARB_texture_compression, GL_EXT_framebuffer_object, 
    GL_EXT_texture_compression_s3tc, GL_EXT_texture_env_combine, 
    GL_EXT_texture_env_dot3, GL_MESA_window_pos, GL_NV_packed_depth_stencil, 
    GL_NV_texture_rectangle, GL_NV_vertex_program, GL_ARB_depth_texture, 
    GL_ARB_occlusion_query, GL_ARB_shadow, GL_ARB_texture_env_combine, 
    GL_ARB_texture_env_crossbar, GL_ARB_texture_env_dot3, 
    GL_ARB_texture_mirrored_repeat, GL_ARB_window_pos, GL_ATI_envmap_bumpmap, 
    GL_EXT_stencil_two_side, GL_EXT_texture_cube_map, GL_NV_depth_clamp, 
    GL_NV_vertex_program1_1, GL_APPLE_packed_pixels, 
    GL_APPLE_vertex_array_object, GL_ARB_draw_buffers, 
    GL_ARB_fragment_program, GL_ARB_fragment_shader, GL_ARB_shader_objects, 
    GL_ARB_vertex_program, GL_ARB_vertex_shader, GL_ATI_draw_buffers, 
    GL_ATI_texture_env_combine3, GL_ATI_texture_float, GL_EXT_shadow_funcs, 
    GL_EXT_stencil_wrap, GL_MESA_pack_invert, GL_MESA_ycbcr_texture, 
    GL_NV_primitive_restart, GL_ARB_depth_clamp, 
    GL_ARB_fragment_program_shadow, GL_ARB_half_float_pixel, 
    GL_ARB_occlusion_query2, GL_ARB_point_sprite, GL_ARB_shading_language_100, 
    GL_ARB_sync, GL_ARB_texture_non_power_of_two, GL_ARB_vertex_buffer_object, 
    GL_ATI_blend_equation_separate, GL_EXT_blend_equation_separate, 
    GL_OES_read_format, GL_ARB_color_buffer_float, GL_ARB_pixel_buffer_object, 
    GL_ARB_texture_compression_rgtc, GL_ARB_texture_float, 
    GL_ARB_texture_rectangle, GL_EXT_packed_float, GL_EXT_pixel_buffer_object, 
    GL_EXT_texture_compression_dxt1, GL_EXT_texture_compression_rgtc, 
    GL_EXT_texture_rectangle, GL_EXT_texture_sRGB, 
    GL_EXT_texture_shared_exponent, GL_ARB_framebuffer_object, 
    GL_EXT_framebuffer_blit, GL_EXT_framebuffer_multisample, 
    GL_EXT_packed_depth_stencil, GL_APPLE_object_purgeable, 
    GL_ARB_vertex_array_object, GL_ATI_separate_stencil, GL_EXT_draw_buffers2, 
    GL_EXT_draw_instanced, GL_EXT_gpu_program_parameters, 
    GL_EXT_texture_array, GL_EXT_texture_integer, GL_EXT_texture_sRGB_decode, 
    GL_EXT_timer_query, GL_OES_EGL_image, GL_MESA_texture_array, 
    GL_ARB_copy_buffer, GL_ARB_depth_buffer_float, GL_ARB_draw_instanced, 
    GL_ARB_half_float_vertex, GL_ARB_instanced_arrays, 
    GL_ARB_map_buffer_range, GL_ARB_texture_rg, GL_ARB_texture_swizzle, 
    GL_ARB_vertex_array_bgra, GL_EXT_separate_shader_objects, 
    GL_EXT_texture_swizzle, GL_EXT_vertex_array_bgra, 
    GL_NV_conditional_render, GL_AMD_draw_buffers_blend, 
    GL_ARB_ES2_compatibility, GL_ARB_blend_func_extended, GL_ARB_debug_output, 
    GL_ARB_draw_buffers_blend, GL_ARB_draw_elements_base_vertex, 
    GL_ARB_explicit_attrib_location, GL_ARB_fragment_coord_conventions, 
    GL_ARB_provoking_vertex, GL_ARB_sampler_objects, GL_ARB_seamless_cube_map, 
    GL_ARB_shader_texture_lod, GL_ARB_texture_rgb10_a2ui, 
    GL_ARB_uniform_buffer_object, GL_EXT_provoking_vertex, 
    GL_EXT_texture_snorm, GL_MESA_texture_signed_rgba, GL_ARB_robustness, 
    GL_ARB_shader_bit_encoding, GL_ARB_texture_storage, 
    GL_EXT_transform_feedback, GL_ARB_invalidate_subdata
$ optirun glxinfo 
name of display: :0
display: :0 screen: 0
direct rendering: Yes
server glx vendor string: VirtualGL
server glx version string: 1.4
server glx extensions:
GLX_ARB_get_proc_address, GLX_ARB_multisample, GLX_EXT_visual_info, 
GLX_EXT_visual_rating, GLX_SGI_make_current_read, GLX_SGIX_fbconfig, 
GLX_SGIX_pbuffer, GLX_SUN_get_transparent_index, GLX_ARB_create_context, 
GLX_ARB_create_context_profile
client glx vendor string: VirtualGL
client glx version string: 1.4
client glx extensions:
GLX_ARB_get_proc_address, GLX_ARB_multisample, GLX_EXT_visual_info, 
GLX_EXT_visual_rating, GLX_SGI_make_current_read, GLX_SGIX_fbconfig, 
GLX_SGIX_pbuffer, GLX_SUN_get_transparent_index, GLX_ARB_create_context, 
GLX_ARB_create_context_profile
GLX version: 1.4
GLX extensions:
GLX_ARB_get_proc_address, GLX_ARB_multisample, GLX_EXT_visual_info, 
GLX_EXT_visual_rating, GLX_SGI_make_current_read, GLX_SGIX_fbconfig, 
GLX_SGIX_pbuffer, GLX_SUN_get_transparent_index, GLX_ARB_create_context, 
GLX_ARB_create_context_profile
OpenGL vendor string: NVIDIA Corporation
OpenGL renderer string: GeForce GT 650M/PCIe/SSE2
OpenGL version string: 4.2.0 NVIDIA 304.43
OpenGL shading language version string: 4.20 NVIDIA via Cg compiler
OpenGL extensions:
GL_AMD_multi_draw_indirect, GL_AMD_seamless_cubemap_per_texture, 
GL_ARB_base_instance, GL_ARB_blend_func_extended, 
GL_ARB_color_buffer_float, GL_ARB_compatibility, 
GL_ARB_compressed_texture_pixel_storage, GL_ARB_conservative_depth, 
GL_ARB_copy_buffer, GL_ARB_depth_buffer_float, GL_ARB_depth_clamp, 
GL_ARB_depth_texture, GL_ARB_draw_buffers, GL_ARB_draw_buffers_blend, 
GL_ARB_draw_indirect, GL_ARB_draw_elements_base_vertex, 
GL_ARB_draw_instanced, GL_ARB_ES2_compatibility, 
GL_ARB_explicit_attrib_location, GL_ARB_fragment_coord_conventions, 
GL_ARB_fragment_program, GL_ARB_fragment_program_shadow, 
GL_ARB_fragment_shader, GL_ARB_framebuffer_object, 
GL_ARB_framebuffer_sRGB, GL_ARB_geometry_shader4, 
GL_ARB_get_program_binary, GL_ARB_gpu_shader5, GL_ARB_gpu_shader_fp64, 
GL_ARB_half_float_pixel, GL_ARB_half_float_vertex, GL_ARB_imaging, 
GL_ARB_instanced_arrays, GL_ARB_internalformat_query, 
GL_ARB_map_buffer_alignment, GL_ARB_map_buffer_range, GL_ARB_multisample, 
GL_ARB_multitexture, GL_ARB_occlusion_query, GL_ARB_occlusion_query2, 
GL_ARB_pixel_buffer_object, GL_ARB_point_parameters, GL_ARB_point_sprite, 
GL_ARB_provoking_vertex, GL_ARB_robustness, GL_ARB_sample_shading, 
GL_ARB_sampler_objects, GL_ARB_seamless_cube_map, 
GL_ARB_separate_shader_objects, GL_ARB_shader_atomic_counters, 
GL_ARB_shader_bit_encoding, GL_ARB_shader_image_load_store, 
GL_ARB_shader_objects, GL_ARB_shader_precision, GL_ARB_shader_subroutine, 
GL_ARB_shader_texture_lod, GL_ARB_shading_language_100, 
GL_ARB_shading_language_420pack, GL_ARB_shading_language_include, 
GL_ARB_shading_language_packing, GL_ARB_shadow, GL_ARB_sync, 
GL_ARB_tessellation_shader, GL_ARB_texture_border_clamp, 
GL_ARB_texture_buffer_object, GL_ARB_texture_buffer_object_rgb32, 
GL_ARB_texture_compression, GL_ARB_texture_compression_bptc, 
GL_ARB_texture_compression_rgtc, GL_ARB_texture_cube_map, 
GL_ARB_texture_cube_map_array, GL_ARB_texture_env_add, 
GL_ARB_texture_env_combine, GL_ARB_texture_env_crossbar, 
GL_ARB_texture_env_dot3, GL_ARB_texture_float, GL_ARB_texture_gather, 
GL_ARB_texture_mirrored_repeat, GL_ARB_texture_multisample, 
GL_ARB_texture_non_power_of_two, GL_ARB_texture_query_lod, 
GL_ARB_texture_rectangle, GL_ARB_texture_rg, GL_ARB_texture_rgb10_a2ui, 
GL_ARB_texture_storage, GL_ARB_texture_swizzle, GL_ARB_timer_query, 
GL_ARB_transform_feedback2, GL_ARB_transform_feedback3, 
GL_ARB_transform_feedback_instanced, GL_ARB_transpose_matrix, 
GL_ARB_uniform_buffer_object, GL_ARB_vertex_array_bgra, 
GL_ARB_vertex_array_object, GL_ARB_vertex_attrib_64bit, 
GL_ARB_vertex_buffer_object, GL_ARB_vertex_program, GL_ARB_vertex_shader, 
GL_ARB_vertex_type_2_10_10_10_rev, GL_ARB_viewport_array, 
GL_ARB_window_pos, GL_ATI_draw_buffers, GL_ATI_texture_float, 
GL_ATI_texture_mirror_once, GL_S3_s3tc, GL_EXT_texture_env_add, 
GL_EXT_abgr, GL_EXT_bgra, GL_EXT_bindable_uniform, GL_EXT_blend_color, 
GL_EXT_blend_equation_separate, GL_EXT_blend_func_separate, 
GL_EXT_blend_minmax, GL_EXT_blend_subtract, GL_EXT_compiled_vertex_array, 
GL_EXT_Cg_shader, GL_EXT_depth_bounds_test, GL_EXT_direct_state_access, 
GL_EXT_draw_buffers2, GL_EXT_draw_instanced, GL_EXT_draw_range_elements, 
GL_EXT_fog_coord, GL_EXT_framebuffer_blit, GL_EXT_framebuffer_multisample, 
GL_EXTX_framebuffer_mixed_formats, GL_EXT_framebuffer_object, 
GL_EXT_framebuffer_sRGB, GL_EXT_geometry_shader4, 
GL_EXT_gpu_program_parameters, GL_EXT_gpu_shader4, 
GL_EXT_multi_draw_arrays, GL_EXT_packed_depth_stencil, 
GL_EXT_packed_float, GL_EXT_packed_pixels, GL_EXT_pixel_buffer_object, 
GL_EXT_point_parameters, GL_EXT_provoking_vertex, GL_EXT_rescale_normal, 
GL_EXT_secondary_color, GL_EXT_separate_shader_objects, 
GL_EXT_separate_specular_color, GL_EXT_shader_image_load_store, 
GL_EXT_shadow_funcs, GL_EXT_stencil_two_side, GL_EXT_stencil_wrap, 
GL_EXT_texture3D, GL_EXT_texture_array, GL_EXT_texture_buffer_object, 
GL_EXT_texture_compression_dxt1, GL_EXT_texture_compression_latc, 
GL_EXT_texture_compression_rgtc, GL_EXT_texture_compression_s3tc, 
GL_EXT_texture_cube_map, GL_EXT_texture_edge_clamp, 
GL_EXT_texture_env_combine, GL_EXT_texture_env_dot3, 
GL_EXT_texture_filter_anisotropic, GL_EXT_texture_format_BGRA8888, 
GL_EXT_texture_integer, GL_EXT_texture_lod, GL_EXT_texture_lod_bias, 
GL_EXT_texture_mirror_clamp, GL_EXT_texture_object, 
GL_EXT_texture_shared_exponent, GL_EXT_texture_sRGB, 
GL_EXT_texture_sRGB_decode, GL_EXT_texture_storage, 
GL_EXT_texture_swizzle, GL_EXT_texture_type_2_10_10_10_REV, 
GL_EXT_timer_query, GL_EXT_transform_feedback2, GL_EXT_vertex_array, 
GL_EXT_vertex_array_bgra, GL_EXT_vertex_attrib_64bit, 
GL_EXT_x11_sync_object, GL_EXT_import_sync_object, GL_IBM_rasterpos_clip, 
GL_IBM_texture_mirrored_repeat, GL_KTX_buffer_region, GL_NV_alpha_test, 
GL_NV_bindless_texture, GL_NV_blend_minmax, GL_NV_blend_square, 
GL_NV_complex_primitives, GL_NV_conditional_render, 
GL_NV_copy_depth_to_color, GL_NV_copy_image, GL_NV_depth_buffer_float, 
GL_NV_depth_clamp, GL_NV_ES1_1_compatibility, GL_NV_explicit_multisample, 
GL_NV_fbo_color_attachments, GL_NV_fence, GL_NV_float_buffer, 
GL_NV_fog_distance, GL_NV_fragdepth, GL_NV_fragment_program, 
GL_NV_fragment_program_option, GL_NV_fragment_program2, 
GL_NV_framebuffer_multisample_coverage, GL_NV_geometry_shader4, 
GL_NV_gpu_program4, GL_NV_gpu_program4_1, GL_NV_gpu_program5, 
GL_NV_gpu_program_fp64, GL_NV_gpu_shader5, GL_NV_half_float, 
GL_NV_light_max_exponent, GL_NV_multisample_coverage, 
GL_NV_multisample_filter_hint, GL_NV_occlusion_query, 
GL_NV_packed_depth_stencil, GL_NV_parameter_buffer_object, 
GL_NV_parameter_buffer_object2, GL_NV_path_rendering, 
GL_NV_pixel_data_range, GL_NV_point_sprite, GL_NV_primitive_restart, 
GL_NV_register_combiners, GL_NV_register_combiners2, 
GL_NV_shader_atomic_counters, GL_NV_shader_atomic_float, 
GL_NV_shader_buffer_load, GL_NV_texgen_reflection, GL_NV_texture_barrier, 
GL_NV_texture_compression_vtc, GL_NV_texture_env_combine4, 
GL_NV_texture_expand_normal, GL_NV_texture_lod_clamp, 
GL_NV_texture_multisample, GL_NV_texture_rectangle, GL_NV_texture_shader, 
GL_NV_texture_shader2, GL_NV_texture_shader3, GL_NV_transform_feedback, 
GL_NV_transform_feedback2, GL_NV_vdpau_interop, GL_NV_vertex_array_range, 
GL_NV_vertex_array_range2, GL_NV_vertex_attrib_integer_64bit, 
GL_NV_vertex_buffer_unified_memory, GL_NV_vertex_program, 
GL_NV_vertex_program1_1, GL_NV_vertex_program2, 
GL_NV_vertex_program2_option, GL_NV_vertex_program3, 
GL_NVX_conditional_render, GL_NVX_gpu_memory_info, 
GL_OES_compressed_paletted_texture, GL_OES_depth24, GL_OES_depth32, 
GL_OES_depth_texture, GL_OES_element_index_uint, GL_OES_fbo_render_mipmap, 
GL_OES_get_program_binary, GL_OES_mapbuffer, GL_OES_packed_depth_stencil, 
GL_OES_point_size_array, GL_OES_point_sprite, GL_OES_rgb8_rgba8, 
GL_OES_read_format, GL_OES_standard_derivatives, GL_OES_texture_3D, 
GL_OES_texture_float, GL_OES_texture_float_linear, 
GL_OES_texture_half_float, GL_OES_texture_half_float_linear, 
GL_OES_texture_npot, GL_OES_vertex_array_object, GL_OES_vertex_half_float, 
GL_SGIS_generate_mipmap, GL_SGIS_texture_lod, GL_SGIX_depth_texture, 
GL_SGIX_shadow, GL_SUN_slice_accum

I will have to wait for another version of Bumblebee, or use more recent NVidia drivers, available for 12.10 in the repository given at the same address here:
$ sudo add-apt-repository ppa:ubuntu-x-swat/x-updates
I have no urgent need for OpenGL capabilities, to say the least, and this extension probably has no link with the CUDA capabilities of the graphics card, so it is not a problem.
I have also tried the OpenGL demos of “qtdemo”: all of them work with the integrated GPU, except “Pixel Buffers 2”, which is not perfect (a white square). All of them work perfectly with “optirun qtdemo”.
However, the garbling problems mentioned at the start of this graphics card section are still present, so I decide to deactivate the desktop effects. To be revisited later.
The CUDA packages are provided in Ubuntu; I have installed them and added them to the Sycomore package list (a quick smoke test follows the list below).
libcublas4                NVIDIA CUDA BLAS runtime library
libcudart4                NVIDIA CUDA runtime library
libcufft4                 NVIDIA CUDA FFT runtime library
libcuinj4                 NVIDIA CUDA INJ runtime library
libcupti4                 NVIDIA CUDA Profiler Tools Interface runtime library
libcupti-dev              NVIDIA CUDA Profiler Tools Interface development files
libcupti-doc              NVIDIA CUDA Profiler Tools Interface documentation
libcurand4                NVIDIA CUDA Random Numbers Generation runtime library
libcusparse4              NVIDIA CUDA Sparse Matrix runtime library
nvidia-cuda-dev           NVIDIA CUDA development files
nvidia-cuda-doc           NVIDIA CUDA and OpenCL documentation
nvidia-cuda-gdb           NVIDIA CUDA GDB
nvidia-cuda-toolkit       NVIDIA CUDA toolkit
python-pycuda             Python module to access Nvidia's CUDA parallel computation API
python-pycuda-doc         module to access Nvidia's CUDA parallel computation API (documentation)
python-pycuda-headers     headers for Python module to access Nvidia's CUDA parallel computation API
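As a quick smoke test of the toolkit (a sketch; under Bumblebee the binary presumably has to run through optirun so that the card is powered on):
$ cat > hello.cu <<'EOF'
#include <cstdio>
__global__ void kernel() {}          // empty kernel, just to exercise the driver
int main() {
    kernel<<<1, 1>>>();
    cudaDeviceSynchronize();         // blocks until the kernel has run
    printf("CUDA OK\n");
    return 0;
}
EOF
$ nvcc hello.cu -o hello
$ optirun ./hello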

Printer and scanner MF4330d

The scanner works out of the box, for example via “skanlite”.
When I try to add a printer at “http://localhost:631”, the MF4330d printer is recognized as “Canon MF4320-4350 (UFRII LT)“, but the driver is not available, as expected: I have to install the proprietary drivers. I download the latest version, 2.50, and generate .deb packages from the .rpm ones with “alien -k” as in previous releases; I obtain cndrvcups-common_2.50-1_amd64.deb and cndrvcups-ufr2-uk_2.50-1_amd64.deb. I install them and try to add the printer in the CUPS web interface. It does not work; the jobs are stopped, and in /var/log/cups/error_log I have:
E [28/Oct/2012:18:50:44 +0100] Canon_MF4320-4350_: File "/usr/lib/cups/filter/pstoufr2cpca" not available: No such file or directory
W [28/Oct/2012:18:51:05 +0100] CreateProfile failed: org.freedesktop.ColorManager.AlreadyExists:profile id ’Canon_MF4320-4350_-Gray..’ already exists
W [28/Oct/2012:18:51:05 +0100] CreateDevice failed: org.freedesktop.ColorManager.AlreadyExists:device id ’cups-Canon_MF4320-4350_’ already exists
E [28/Oct/2012:18:51:05 +0100] Canon_MF4320-4350_: File "/usr/lib/cups/filter/pstoufr2cpca" not available: No such file or directory
E [28/Oct/2012:18:52:36 +0100] Canon_MF4320-4350_: File "/usr/lib/cups/filter/pstoufr2cpca" not available: No such file or directory
E [28/Oct/2012:18:52:36 +0100] [Job 1] Unable to start filter "pstoufr2cpca" - Success.
E [28/Oct/2012:18:52:36 +0100] [Job 1] Stopping job because the scheduler could not execute a filter.
E [28/Oct/2012:18:53:14 +0100] Canon_MF4320-4350_: File "/usr/lib/cups/filter/pstoufr2cpca" not available: No such file or directory
E [28/Oct/2012:18:53:14 +0100] [Job 2] Unable to start filter "pstoufr2cpca" - Success.
E [28/Oct/2012:18:53:14 +0100] [Job 2] Stopping job because the scheduler could not execute a filter.
E [28/Oct/2012:18:53:35 +0100] Canon_MF4320-4350_: File "/usr/lib/cups/filter/pstoufr2cpca" not available: No such file or directory
E [28/Oct/2012:18:53:35 +0100] [Job 2] Unable to start filter "pstoufr2cpca" - Success.
E [28/Oct/2012:18:53:35 +0100] [Job 2] Stopping job because the scheduler could not execute a filter.
Still a problem with pstoufr2cpca! The solution is probably the same as in the previous installation: create a /usr/lib64 symlink pointing to /usr/lib before installing the .deb packages. After installing them, /usr/lib/cups/filter/pstoufr2cpca is present as an executable file. Now it works perfectly!
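For the record, the whole sequence is roughly as follows (a sketch; the .rpm file names are assumptions):
$ sudo ln -s /usr/lib /usr/lib64    # so the Canon packages install their filter where CUPS finds it
$ sudo alien -k cndrvcups-common-2.50-1.x86_64.rpm cndrvcups-ufr2-uk-2.50-1.x86_64.rpm
$ sudo dpkg -i cndrvcups-common_2.50-1_amd64.deb cndrvcups-ufr2-uk_2.50-1_amd64.deb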

Card reader

It does not work out of the box. But the last two lines of the lspci output are:
$ lspci
[...]
04:00.0 Unassigned class [ff00]: Realtek Semiconductor Co., Ltd. Device 5289 (rev 01)
04:00.2 Ethernet controller: Realtek Semiconductor Co., Ltd. RTL8111/8168B PCI Express Gigabit Ethernet controller (rev 0a)

And it is clear that the first one is the card reader (per a Google search). The problem can be solved as described here: on an LDLC laptop.
I simply installed the .deb file (saved in /usr/local/packages) and rebooted, and it worked (update: it works out of the box in Kubuntu 13.04).

Various

Sycomore.py

# No need to wait for the return of wajig here up to (K)ubuntu
# 12.04.
# But from (K)ubuntu 12.10, I obtain an error if I proceed like
# that:
# E: Problem renaming the file /var/cache/apt/pkgcache.bin.a7Hczi
# to /var/cache/apt/pkgcache.bin - rename (2: No such file or
# directory)
# W: You may want to run apt-get update to correct these
# problems
# apt-get update does not change anything, it is really due to
# the fact that we must wait for "wajig clean" to finish, before
# trying to install another package.

VirtualBox

I obtain an error when trying to start a virtual machine in VirtualBox:
Failed to open a session for the virtual machine winXP.
cpum#1: CPU vendor mismatch: host=’GenuineIntel’
saved=’AuthenticAMD’ [ver=12 pass=final]
(VERR_SSM_LOAD_CPUID_MISMATCH).
Result Code: NS_ERROR_FAILURE (0x80004005)
Component: Console
Interface: IConsole {1968b7d3-e3bf-4ceb-99e0-cb7c913317bb}

The following webpage should help, but currently it does not work (SQL error): here.
Finally, the solution was much easier: the problem came from the fact that the machine had a saved state. To discard this state: right-click on the machine and choose “Discard saved state”. Then it was able to start successfully.
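The same thing can be done from the command line (using the VM name from the error above):
$ VBoxManage discardstate winXP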

New internal hard drive

After 3 weeks of use, it seems that I have a problem with power management on /dev/sdb (internal hard drive):
$ smartctl -a /dev/sdb|\grep -i load
193 Load_Cycle_Count        0x0032   100   100   000    Old_age   Always       -       3936
222 Loaded_Hours            0x0032   099   099   000    Old_age   Always       -       404
223 Load_Retry_Count        0x0032   100   100   000    Old_age   Always       -       0
224 Load_Friction           0x0022   100   100   000    Old_age   Always       -       0
226 Load-in_Time            0x0026   100   100   000    Old_age   Always       -       258
The model is:
$ sudo hdparm -i /dev/sdb
/dev/sdb:
 Model=TOSHIBA MQ01ABD075, FwRev=AX001U, SerialNo=62K7P47OT
 Config={ Fixed }
 RawCHS=16383/16/63, TrkSize=0, SectSize=0, ECCbytes=0
 BuffType=unknown, BuffSize=8192kB, MaxMultSect=16, MultSect=16
 CurCHS=16383/16/63, CurSects=16514064, LBA=yes, LBAsects=1465149168
 IORDY=on/off, tPIO={min:120,w/IORDY:120}, tDMA={min:120,rec:120}
 PIO modes:  pio0 pio1 pio2 pio3 pio4 
 DMA modes:  sdma0 sdma1 sdma2 mdma0 mdma1 mdma2 
 UDMA modes: udma0 udma1 udma2 udma3 udma4 *udma5 
 AdvancedPM=yes: unknown setting WriteCache=enabled
 Drive conforms to: Unspecified:  ATA/ATAPI-3,4,5,6,7
 * signifies the current active mode

But if I do:
$ hdparm -B 255 /dev/sdb
/dev/sdb:
 setting Advanced Power Management level to disabled
 APM_level      = off
It does not work any better: I get a lot of clicks and whistles. With my previous HDD (a Western Digital in the MSI laptop), this workaround worked correctly. No more success with:
$ hdparm -B 1 /dev/sdb
or
$ hdparm -B 254 /dev/sdb
It does not work better when IDE is selected instead of AHCI in the BIOS. Other information about Toshiba drives can be found on the internet: here and here. These pages say that levels 254, 250, or 192 are sometimes needed instead of 255. But I have tried these settings without more success.
However, I have discovered something interesting. When running smartctl to display the number of load cycles, a whistle starts and continues for several seconds. I have found that it stops automatically if the value used for “hdparm -B” is decreased enough:
I thus decide to use these settings, because this whistle is very frequent on my machine, particularly at idle. But this does not solve the major problem with the Toshiba HDD: it loads and unloads its heads all the time, unlike the Seagate one.
Note that to make this change permanent, creating scripts matching the glob /etc/apm/*/99-*.sh does not work on my Kubuntu 12.10: instead, uncomment the “apm = 255” line in /etc/hdparm.conf and put the correct setting there.
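A sketch of the resulting stanza in /etc/hdparm.conf (the value 192 is a placeholder: use whatever level proved quiet above):
/dev/sdb {
    apm = 192
}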
Due to the Toshiba's constant load/unload cycles, I have swapped these two hard drives: I now use the Toshiba as an external hard drive for backups, and the Seagate one in my laptop. The procedure was relatively easy, because the Seagate was already encrypted and contained my /home folder:
After several days, I am rather satisfied with the new configuration. The Seagate HDD is probably noisier, but constantly so: there is no sudden noise to disturb concentration. This is not a surprise, because it was delivered as part of an external HDD, aimed at backup, where noise level matters less. But with my hdparm settings, it makes far less noise than the Toshiba one!
#! CUT MATERIAL

Vim

With my current ~/.vimrc, I cannot use boxes in a TODO file, for example with “,mb”. But a workaround is to open another file, for example ~/.vimrc: then “,mb” works correctly in the TODO file, with the syntax of .vimrc.

Processor, GPU, Playstation 3

The PlayStation 3 contains a very powerful IBM Cell processor, and the console is not expensive (299€). It seems that this processor is far more powerful than an Intel i7. For example, here:
The Cell processor is potentially more powerful than i7 but it requires lots of coding to work right. It was used in supercomputers (lots of cell processors in one supercomputer off course, no single processor makes a supercomputer)
Cell processor has a raw power of over 200 gigaflops while i7 is about 50.
or here:
Cell is very powerfull in single tasks but piss poor in multitasking, X86 CPU’s are average in single tasks but great in multitasking. Cell would suck running a Windows OS or something similar (as shown with Linux).
So, if Linux can be installed on it, it is an extraordinarily powerful computation platform. Some people have used networks of PlayStation 3s for computations, but Sony tried to stop that at some point; hackers, however, found a workaround: see here or here (“On May 25, 2011, the Rebug team, author of the CFW Rebug 3.55, released 2 tools to restore the Other OS on PS3s running the Rebug 3.55 CFW.”). Why such a powerful processor? Maybe the answer (to be checked) is here:
The GPU+CPU pair of a PC has nothing to do with what makes up a console.
The console's graphics chip and its CPU do not play the same roles as in a PC. In a console, the CPU is much more important for graphics processing; it is put to work far more than a PC's CPU, which delegates as much as possible to the GPU (hence its name, the GPU, whereas one rarely speaks of a GPU for a console, but rather of a graphics chip).
The graphics chips of the PS3 and Xbox 360 are mostly there for the final rendering and for processing such as antialiasing, texturing, etc.
So it is pointless to say “there is a big CPU and a cr##py GPU” in a console, unlike in a PC... because the distribution of the processing is not the same.
The guys who compute a console's power just by adding up figures and chip specifications have definitely understood nothing...
So I think that the same computational power can be reached if the PC's GPU is used in the computations: think e.g. of CUDA. A comment here expresses the same opinion:
you basically cannot compare the two because they are for different fields. the cell is for intense mathematical calculations (somewhat like a GPU i would guess) and the Xeon is made for everyday use and speed.