I think they wanted the cinematic look and feel that a low framerate gives you, but then it puzzles me why they didn't go with 24, which is the more standard film framerate and presumably what their pre-rendered cutscenes run at.
Another theory is that maybe they were worried about certain cutscenes having an unstable framerate on lower-end hardware?
It's bizarre that they don't quite understand this, because cinematics are Blizzard's specialty. To make 24 fps (or even 30 fps) look properly film-like, they need to simulate the same kind of motion blur a film camera produces. To my eye, the in-game cinematics don't use motion blur, or if they do, it isn't close enough to read as a standard 180 degree shutter (where each frame is exposed for half the frame interval, i.e. 1/48 s at 24 fps).
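For anyone curious what the 180 degree shutter rule actually means numerically, here's a quick sketch (my own illustration, nothing to do with Blizzard's engine): the shutter angle sets what fraction of the frame interval is exposed, which in turn sets how long the blur streak of a moving object is.

```python
def shutter_exposure(fps: float, shutter_angle: float = 180.0) -> float:
    """Exposure time per frame in seconds for a given shutter angle.

    A 360-degree shutter exposes the whole frame interval; 180 degrees
    exposes half of it.
    """
    return (shutter_angle / 360.0) / fps


def blur_length(speed_px_per_s: float, fps: float,
                shutter_angle: float = 180.0) -> float:
    """Length in pixels of the motion-blur streak for an object
    moving at a constant on-screen speed."""
    return speed_px_per_s * shutter_exposure(fps, shutter_angle)


if __name__ == "__main__":
    # Classic film look: 24 fps with a 180 degree shutter -> 1/48 s exposure.
    print(shutter_exposure(24))    # ~0.0208 s
    # An object crossing the screen at 480 px/s leaves a 10 px streak per frame.
    print(blur_length(480, 24))    # 10.0
```

So at 24 fps without that half-frame-interval of blur, motion just looks choppy rather than cinematic, which is exactly the problem with running in-game cinematics at a film framerate and skipping (or underdoing) the blur.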