Friday, 7 August 2015

Matlab's repmat in Python

This post is partly to remind myself of this, partly for others who might find themselves in a similar situation. I find myself wanting to use a repmat function in Python - used as I am to Matlab's syntax. A quick search will tell you that the function to use is numpy.tile, and that's indeed true.

But, say that you have an image of shape (1024,512) and you'd like it to become (1024,512,3). What's the way to go? In Matlab you would just say

B = repmat(A,[1,1,3]);

and if you try that with np.tile you'll find that B has shape (1,1024,1536). Somehow, it's added a new dimension at the beginning, not at the end!

To obtain the same behaviour as Matlab, you have to do

B = np.tile(A.reshape(1024,512,1), (1,1,3))

In practice you're explicitly adding a dimension to the array A before tiling it.
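A minimal sketch of both behaviours (the shapes are the ones from the example; any other way of appending a trailing axis, e.g. `A[:, :, None]`, works just as well):

```python
import numpy as np

A = np.zeros((1024, 512))

# Naive tiling: numpy prepends the missing dimension, so the
# repetition ends up along a new first axis.
B_wrong = np.tile(A, (1, 1, 3))
print(B_wrong.shape)  # (1, 1024, 1536)

# Matlab-style repmat(A, [1, 1, 3]): add the trailing axis first.
B = np.tile(A.reshape(1024, 512, 1), (1, 1, 3))
print(B.shape)  # (1024, 512, 3)
```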

Having said that, numpy does something that Matlab doesn't: it broadcasts operators. So if you have two arrays of different dimensions, numpy can automatically repeat the same operation (a bit like Matlab's bsxfun). The catch is that numpy aligns shapes starting from the last dimension, and adds any missing dimensions at the front of the smaller array. So let's say we want to add A of shape (1024,512) and B of shape (1024,512,3). The way I'd go about doing that is with transposes:

C = A.T + B.T
C = C.T

Transposition in numpy is more generic than in Matlab and can be applied to n-dimensional arrays: it reverses the order of the dimensions, so an array of shape (w,x,y,z) has a transpose of shape (z,y,x,w). We're now adding arrays of shapes (512,1024) and (3,512,1024), which works because the trailing dimensions match and the broadcasting happens along the new first dimension. Then, of course, the result needs to be transposed back.
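Put together, a small sketch of the transpose trick (same shapes as above, filled with ones just to have something to add):

```python
import numpy as np

A = np.ones((1024, 512))        # e.g. a greyscale image
B = np.ones((1024, 512, 3))     # e.g. an RGB image

# A.T has shape (512, 1024) and B.T has shape (3, 512, 1024):
# numpy aligns trailing dimensions, so A.T is repeated 3 times.
C = (A.T + B.T).T
print(C.shape)  # (1024, 512, 3)
```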

Wednesday, 29 July 2015

Compiling OpenCV on Yosemite

I found myself wanting to use some OpenCV features from Python. But I prefer to use Python 3, which made the Homebrew distribution of OpenCV almost useless, as the version that comes with Homebrew doesn't support it. The only option was then to compile OpenCV 3.0.0 myself. Here's how I went about doing it.

For the following, I'm assuming you're using the Python 3.4.3 that comes with Homebrew. Note that future updates might break the compilation, so proceed with care. Also, the OpenCV distribution that comes with Homebrew mustn't be installed, or you'll have problems linking this one later.

The first step is downloading OpenCV 3.0.0 - using git works just as well as downloading the zip file.

git clone https://github.com/Itseez/opencv.git
cd opencv
mkdir release
cd release

Now I'm assuming you'd like to keep your installation tidy, and the best way is to let Homebrew handle this. The next command will let you configure the compilation, and to do so you will have to set some variables. The command to launch is

ccmake ..

You need to fill in the following variables:

CMAKE_INSTALL_PREFIX        /usr/local/Cellar/opencv/3.0.0
PYTHON2_EXECUTABLE          /usr/local/bin/python3.4
PYTHON3_EXECUTABLE          /usr/local/bin/python3.4
PYTHON3_INCLUDE_DIR         /usr/local/Frameworks/Python.framework/Versions/3.4/include/python3.4m
PYTHON3_LIBRARY             /usr/local/Frameworks/Python.framework/Versions/3.4/lib/libpython3.4.dylib
PYTHON3_NUMPY_INCLUDE_DIRS  /usr/local/lib/python3.4/site-packages/numpy/core/include
PYTHON3_PACKAGES_PATH       /usr/local/lib/python3.4/site-packages

Now, CMAKE_INSTALL_PREFIX is what will later allow Homebrew to handle your installation of OpenCV. Also, you might notice that PYTHON2_EXECUTABLE is actually set to Python 3. For reasons beyond me, without that it doesn't look like the Python bindings get compiled correctly, so, well, whatever works.

At this point, you can check that cmake is happy with what you've done with Python. To do so, first of all, save your configuration by pressing <C> from the keyboard, and then <Q>. Then, just type

cmake ..

and you should see the following:

--   Python 2:
--     Interpreter:                 /usr/local/bin/python3.4 (ver 3.4.3)
--
--   Python 3:
--     Interpreter:                 /usr/local/bin/python3 (ver 3.4.3)
--     Libraries:                   /usr/local/Frameworks/Python.framework/Versions/3.4/lib/libpython3.4.dylib (ver 3.4.3)
--     numpy:                       /usr/local/lib/python3.4/site-packages/numpy/core/include (ver 1.9.2)
--     packages path:               /usr/local/lib/python3.4/site-packages
--
--   Python (for build):            /usr/local/bin/python3.4

This way you should be confident that the compilation process will generate all your Python bindings. And now we go with the usual

make
make install

and of course, to make sure that Homebrew will take care of your installation,

brew link opencv

If you want to make sure that the Python binding has been compiled, you should find the file lib/python3/cv2.so inside your release directory.

And finally, one little catch. Keep in mind that this cv2.so will end up inside your python site-packages, i.e., outside the reach of Homebrew. An alternative might be to set PYTHON3_PACKAGES_PATH somewhere inside the Cellar, and then add it to your PYTHONPATH variable, but I don't know if that will work or not.

Friday, 7 February 2014

A rather impressive demo

I'm not sure what would push anyone to do this, but hats off. Here's a Super Mario Bros demo that runs in Matlab. I haven't really had the time to skim through the code, but I'm impressed with the effort it must have taken to get this to work at 60 fps.

Tuesday, 3 September 2013

Undocumented Matlab

I'm not writing here nearly as often as I'd like, but as the theme of the latest few posts has been Matlab I thought it would be appropriate to continue in the same fashion.

I came across a very interesting blog about undocumented Matlab functions. It has all sorts of tips and tricks: for example, did you know you can have Matlab run using a more recent JVM? No? Me neither. (But there are very few reasons why anyone would want to do that.) Or, did you know you can force Matlab to use the new (still beta) version of the handle graphics (i.e., the component that draws the plots)? No? I had no idea either. This could be useful, as the new graphics look a lot better than the usual ones.

Well, I've given you the source of all that, so dive in and enjoy...

Tuesday, 29 January 2013

Functional programming in Matlab

I came across a very interesting blog post recently. I'll just link it here so you can read it at the source. But to give you a cliffhanger, how would you write a min_and_max function in Matlab, that does this:

[extrema, indices] = min_and_max(y);

You could write an m-file but the alternative is the following mind-blowing and very clever one-liner:

min_and_max = @(x) cellfun(@(f) f(x), {@min, @max});

I love it. Check out the original blog post on Loren on the Art of Matlab for an explanation on how it works.
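For comparison, here's a rough numpy-based analogue of the same idea (my own sketch, not from the original post): apply a list of functions to the same input, collecting both the extrema and their positions, like Matlab's two-output cellfun.

```python
import numpy as np

def min_and_max(y):
    """Return ([min, max], [argmin, argmax]) of y in one call."""
    y = np.asarray(y)
    extrema = np.array([y.min(), y.max()])
    indices = np.array([y.argmin(), y.argmax()])
    return extrema, indices

extrema, indices = min_and_max([3, 1, 4, 1, 5])
print(extrema)  # [1 5]
print(indices)  # [1 4]
```

Note that, unlike Matlab, the indices here are zero-based.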

Thursday, 24 January 2013

Homebrew

I haven't used Linux for a long time now. I remember with mixed feelings compiling programs: partly the satisfaction of having a piece of software exactly tuned for your machine, and partly the pain of stashing files somewhere in the system without knowing exactly how to get rid of them later on.

Fast forward 10 years (did I say 10? Great Scott!), I'm sitting here at my shiny MacBook Pro and thinking "Heck, I'd really need this or that program/library/whatnot".

Now, would I be happy to pile up files randomly in OS X? Not exactly. Would I be happy to create a ~/local folder to save the binaries in? Good luck with that if you need to link to a library. Here's where tools like fink or MacPorts come in handy. However, for one reason or another, neither of those ever quite hit the spot for me.

The one that did hit the spot was Homebrew. I'm not exactly sure why. Probably because it's quite easy to use (not that the others aren't) yet allows you to customize quite a lot of your installation (brew install --interactive package). It's also very easy to compile a package that isn't included in the distribution while registering it with Homebrew, so that it's easy to remove or update later on. Once you do that, you can also contribute the recipe to Homebrew itself if you're so inclined, saving others from going through the same pain you've suffered (I haven't gone that far yet, possibly because what I've compiled so far required a few tricks that would take too much work to recreate in an automated script).

So, here are a couple of tricks I learnt that weren't obvious from the documentation. Say you want to compile some package that comes with Homebrew, but for some reason you desperately want it to be tuned for your CPU, or you want to compile it with/without a certain feature which Homebrew doesn't/does include by default. The way to go is:

brew install --interactive package

At this point you're on your own to go through the usual configure; make; make install routine. Something that helps you in this is brew diy, which can be used to tell configure where to send your compiled binaries (it sets the --prefix option to configure). The command would be something like this:

./configure $(brew diy)

although of course you have to add your own options to that.

The other very useful trick is compiling packages that aren't included in Homebrew, registering them with Homebrew so that you can easily delete them later on. Say you want to install painfulcompilation 1.0. You decompressed the .tar.gz and now your terminal is in the folder painfulcompilation-1.0. This is what you need to do:

./configure $(brew diy)
make
make install
brew link painfulcompilation

and magically, painfulcompilation is installed in what is known as the Cellar, and all the binaries are linked to /usr/local/. How cool is that?

Naturally not everything is perfect. Something quite crucial is to make sure that your user has permissions to write in /usr/local otherwise you're going to have a bad time (although fortunately this type of error is quite easy to track down).

The good thing in all this is that it allows for all the dirtiest tricks that get something compiled. For example, I was trying to install the pfstools on my laptop (they're a set of tools to operate on HDR images, if you're interested). Funny thing: clang would compile most of them but then would fail on one of the tone mapping operators (Fattal's, if you're interested) because it doesn't support OpenMP. On the other hand, llvm-gcc supports OpenMP but would give an error on another tone mapping operator. So I used two compilers to compile the whole package, by calling ./configure a second time after I got the first error. That is the kind of stuff you can get away with using Homebrew.

Tuesday, 18 December 2012

Have you seen the pictures from Mars?

No? Ah, that's a shame. Here you go. First link, NASA photojournal. The photos there have already been processed and stitched, and I find the panoramas astonishing. It's just unbelievable that those pictures come from another world.

Second link, raw images. You probably want to select pictures from Mastcam or MAHLI if you want to see anything of interest. Among the Mastcam photos you'll also find a few images taken in the near-infrared spectrum, if you're into multispectral imaging at all (here is a detailed description of the Mastcam, including the spectral sensitivity of the sensor and the spectral transmission of each filter).

For me, the drawback of the raw images is that for most of them only the visible-light high resolution version is available, while there's only a tiny thumbnail for the near-infrared. The reason for this is bandwidth. Yes, it doesn't take a lot of bandwidth to download a photo. But Curiosity is on Mars, and the interplanetary bandwidth is very limited. On top of this, the same antenna that communicates with Curiosity is used for all the other probes as well (or at least some of them, I'm not sure now and can't be bothered to google it). So, limited bandwidth and a limited time slot too. The result is that NASA engineers download only thumbnails of the images, and then download the higher resolution versions of those they deem worth the effort.

It's a shame, because I'd be very interested in putting my hands on some misty image of the horizon, where the near-infrared version is crisp and clear...