Right, the “potato, potato” thing probably doesn’t work that well in writing, but you get the point. Or at least, you soon will. I got my hands on an old Sigma ‘Super Wide’ 24/2.8 a couple of weeks ago. I came across it and just couldn’t resist; a wide-angle prime is a convenient thing to have, after all. Upon receiving it, I immediately tried it out and…it didn’t work properly. Drat. Well, it did on my old Canon EOS 50e, but it didn’t work on an EOS 7D or an EOS 30. Turns out it’s a well-known compatibility issue. Turns out also that, guess what? It can be fixed!
First, what’s the problem? Well, as said, on a Canon 50e, this lens works just fine. I ran a test roll with it and I actually really liked what I saw. It’s decently sharp, shows decently little barrel distortion, performs decently in terms of flare, and there’s not all that much vignetting (which I don’t mind much anyway). Alright, its autofocus is noisy as all hell, but I often shoot manual focus anyway, so this doesn’t bother me. All considered, given what I paid for it, it’s a very decent performer. Actually, I’d like to use it a bit more. But how about using it on my favorite EOS, which is the EOS 30?
Well, everything’s just fine when used wide open. Shoot it at f/2.8 and it just happily exposes frame after frame; identical behavior on the EOS 30 film body and the digital EOS 7D. But stopped down, regardless of how far, the camera locks up upon pressing the shutter button. The EOS 30 throws a tantrum: the battery indicator starts flashing, the rest of the LCD goes blank and the camera needs to be turned off and on again to start working. Oh, and the frame isn’t exposed, either. The EOS 7D is a bit more sophisticated and reports an ‘error 99’. Internet lore has it that this is often a shutter problem, and/or that the lens contacts need cleaning, and…well, lots of information, but not much of actual use.
One more clue: try the depth of field preview and the same thing happens on both cameras. Crash, boom, lock-up, error… That’s actually a useful clue, because it suggests it’s something to do with the aperture control. The aperture itself certainly works just fine; the shots on the 50e came out well, even stopped down.
So some more Googling, and ah…there we have it. The Sigma 24/2.8 Super Wide just isn’t fully EOS compatible. It never really was. Well, it was for a brief while when it had just come out, but it ceased to be as Canon brought out new cameras. Check out the link I just gave for pages and pages of problem reports on this very lens.
What’s the problem exactly? The EOS system, and particularly the EF mount, is highly automated and features extensive camera-lens communication. This logically extends to the aperture control, with the camera telling the lens, for instance, how far to stop down before taking an exposure: it gives a command to the lens (aperture control) and a value (how far to stop down). Actually, the set of commands is pretty expansive, and some thorough reverse engineering has been done by several people. It makes for interesting reading; one of the most worthwhile write-ups is by a Frenchman. It’s in French, but give it a read if you’re interested in the internal workings of the EF mount, or at least its information layer. The same guy also documented an Arduino-based device that acts as the host camera in an EF system.
The whole business of reverse engineering is also what companies like Sigma had to do when they started offering EF lenses. And that’s where a glitch happened, which is also the point where the potatoes come in. It turns out that the ‘aperture control’ command is not one, but actually two commands in EF-speak. Older EOS bodies used to issue command 0x12 for aperture motor control, while later ones (starting in the mid-1990s or so) use 0x13 for the same purpose, and in the same way. In fact, it appears Canon treats these commands for all intents and purposes as synonymous. Potato, potato!
When all goes as planned, the camera tells the lens to ‘0x12’ (move the aperture motor), to which the lens responds with ‘0x00’ (which apparently means ‘OK’), followed by the camera sending a numerical value representing ‘by how much’, to which the lens responds ‘0x12’ (‘yeah, aperture control; got it’). Sigma in the early 1990s (or thereabouts) only knew about 0x12 as the aperture command. A newer EOS camera telling a Sigma lens to ‘0x13 quite a bit, pretty please’ results in deafening silence from the Sigma lens. And that apparently freaks out the camera. Even if it didn’t, it would still mean an inoperative aperture. Problem. Camera says ‘potato’, lens expects ‘potato’, miscommunication happens, before you know it, nukes get fired…err…well.
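If you like your handshakes in code form, the failure mode can be sketched in a few lines of C. This is purely an illustration using the byte values described above – `sigma_lens_response` is a hypothetical model of the old lens’s firmware, not anything actually in the lens:

```c
/* Hypothetical model of an early-1990s Sigma lens: it only knows the
 * old 0x12 aperture command, not the newer synonym 0x13. */
#define NO_RESPONSE (-1)  /* the lens stays silent */

int sigma_lens_response(int command)
{
    if (command == 0x12)
        return 0x00;      /* 'OK' -- aperture control acknowledged */
    return NO_RESPONSE;   /* 0x13 is met with deafening silence */
}
```

An old body asking with 0x12 gets its 0x00 acknowledgement; a newer body asking with 0x13 gets nothing at all, and promptly throws its tantrum.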
Having learned this, I figured that it might be possible to somehow intervene. What if we put an interpreter between the camera and the lens, so that the lens would hear ‘gimme some 0x12, please’ when the camera in fact asked ‘would you mind 0x13-ing for a bit’? Turns out, that’s a problem I didn’t even have to solve. As with many obvious ideas, someone else had already invented the wheel. And this one turns out to be a very functional wheel indeed. Cue GitHub.
Without further ado, here it is: the ‘sigmafix’ by Hector Martin, a.k.a. marcan. It’s beautifully simple, and therefore (probably) as reliable as it can be. The hardware couldn’t be any simpler; it’s a tiny 8-legged, 8-bit microcontroller (Microchip’s ATtiny13) and a single 220 Ohm resistor. That’s it, all of it. Here’s the schematic:
In plain text: the ATtiny13 microcontroller is connected to digital ground and the lens’ power supply, and has its PB1 pin connected directly to the LCLK (‘lens clock’, I presume). The 220 Ohm resistor is inserted into the DCL (‘data from camera to lens’), and the ATtiny13’s PB0 pin is connected to that signal line on the lens side.
It turns out that the EF ‘connector’ is basically an SPI (Serial Peripheral Interface) bus with a clock line and two data lines, one for data going in each direction between the ‘client’ and the ‘server’. The connections as outlined above allow the microcontroller to monitor the clock signal as well as the data going from the camera to the lens. Furthermore, the microcontroller can modify this data without hurting the camera (or the lens): the series resistor protects the camera from a high current draw or sink if the microcontroller pulls DCL low while the camera sets it high, and vice versa.
The genius of this solution is that it can remain passive for most of the time: as long as it’s not needed, the microcontroller can just do nothing at all and the camera and lens will chat with each other nicely. But when it needs to, the microcontroller can briefly intervene. The code in the ATtiny13 does this by listening for the 0x13 command being sent by the camera to the lens, and then changing one bit in that command to make it look like 0x12 to the lens. That’s all. The lens will then understand the 0x12 command it was given (which in fact was the more modern 0x13 command) and respond to it properly.
That this is possible makes perfect sense if you consider the binary representations of the 0x12 and 0x13 commands. These are hexadecimal values, and the communication over the EF mount is a binary, serial one. In other words: bits get sent over the line from camera to lens, or vice versa. The binary representation of 0x12 is ‘00010010’, while 0x13 in binary is ‘00010011’. Note that only the last bit differs. Now, the EF communication protocol dictates that the ‘most significant bit’ (MSB) is sent first. This is the bit that represents the largest value – the leftmost bit when we write numbers out in the usual human-readable way. So if the camera sends 0x12 to the lens, it actually sends the bits in the order you would read them from left to right: 0, 0, 0, 1, 0, 0, 1, 0.
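A one-liner makes the wire order concrete. This little helper (my own illustration, nothing from the sigmafix code) returns the i-th bit of a byte as it appears on the line, MSB first:

```c
#include <stdint.h>

/* The i-th bit (i = 0..7) of a byte as it appears on the EF bus,
 * MSB first: i = 0 yields bit 7, i = 7 yields bit 0. */
int wire_bit(uint8_t byte, int i)
{
    return (byte >> (7 - i)) & 1;
}
```

Walk through 0x12 with it and you get 0, 0, 0, 1, 0, 0, 1, 0 – and for 0x13 the first seven bits come out identical, with only the final bit reading 1 instead of 0.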
The smart thing about marcan’s program is that it just has the ATtiny13 listening in on what the camera sends to the lens, looking specifically for command bytes that start with the first 7 bits of 0x12 or 0x13 – which is the same pattern for both values: ‘0001001’. Since we know that the lens only ‘understands’ 0x12 and will ignore 0x13, the ATtiny listens for the first 7 bits and, if those match, it ensures that the data line remains pulled low for the last (8th) bit as well, forcing it to be a zero. If the camera actually sends 0x12, this has no effect, but if it sends 0x13, the value is changed to 0x12. Nifty!
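The trick can be sketched in C, even though the real fix is hand-written AVR assembly clocked off LCLK. The function below is my own paraphrase of the idea, not marcan’s code: the camera’s byte is shifted through MSB first, and once the first seven bits spell ‘0001001’ (the prefix shared by 0x12 and 0x13), the eighth bit is forced low:

```c
#include <stdint.h>

/* Sketch of the sigmafix bit-forcing idea. Bits arrive MSB first; if
 * the first seven bits equal 0b0001001 (0x09), the eighth bit is held
 * low -- the equivalent of pulling DCL to ground on the lens side. */
uint8_t byte_seen_by_lens(uint8_t from_camera)
{
    uint8_t seen = 0;
    for (int i = 0; i < 8; i++) {
        int bit = (from_camera >> (7 - i)) & 1;  /* MSB first */
        if (i == 7 && seen == 0x09)              /* 0b0001001 so far */
            bit = 0;                             /* force the LSB low */
        seen = (seen << 1) | bit;
    }
    return seen;
}
```

Feed it 0x13 and the lens sees 0x12; feed it 0x12 and nothing changes; any other byte passes through untouched, which is exactly why the device can stay passive the rest of the time.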
When the camera sends 0x13, the last bit would be a one (a high voltage level), and since the ATtiny forces the data line low (0 volts), the camera’s output driver would briefly be driving against a short – theoretically a near-infinite current. To prevent this, the 220 Ohm resistor in the data line limits the current to 5V / 220Ω ≈ 22.7mA (assuming 5V levels). I actually used 330 Ohms to limit it further to ca. 15mA, but the EOS cameras apparently don’t complain if they momentarily need to handle over 20mA on an EF mount data line. So that’s why the resistor is in there.
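For the record, the numbers above are just Ohm’s law; a trivial helper (my own, purely for illustration, assuming the 5V levels mentioned above) reproduces them:

```c
/* Worst-case contention current (in mA) through the series resistor
 * when the ATtiny pulls DCL low against the camera driving it high.
 * Plain Ohm's law: I = V / R, scaled to milliamps. */
double contention_ma(double volts, double ohms)
{
    return volts / ohms * 1000.0;
}
```

`contention_ma(5.0, 220.0)` comes out around 22.7mA, and `contention_ma(5.0, 330.0)` around 15.2mA – matching the figures above.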
Apparently, it’s OK for the camera that the final acknowledgement from the lens on the aperture routine reads 0x12 instead of 0x13. Until Canon decides in a next generation camera to not allow this anymore, of course – but I doubt the EF mount will live long enough to see this happen.
So, practically, how can this be implemented? First, we need the microcontroller. The ATtiny13 is an old part and is superseded by the functionally identical ATtiny13A, which we can safely substitute in this application. I used an ATtiny13A-SU, which is already pretty small, but I’d recommend getting the -SSU part, which is slightly smaller still. Size matters in this application.
Then, the firmware needs to be compiled. I didn’t even bother to rewrite or improve the existing work; I went through the listing (with my basic working knowledge of assembly language) and decided that there’s just no damn reason whatsoever to mess with something that’s already as elegant and adequate as this. Yes, I considered rewriting it in C++ so I could use the Arduino IDE and its accompanying tools, but why bother?
The only snag here is that this was written for the ‘avra’ assembler, an open-source AVR assembler that’s mostly (probably exclusively) used on Linux. I couldn’t find a Windows distributable, but I remembered just in time that I have a small Linux server running for some home automation and the 3D printer. So I installed avra onto it and assembled the code from the GitHub page – which was literally a 5-minute job and produced no errors whatsoever.
Not feeling like messing with an assembly compiler and just looking for the pre-compiled .hex file? I’ve got you covered, you can download it here: sigmafix.hex
To flash the device, I made a little ‘experimenter board’ for the ATtiny13A, although you could just as well jerry-rig it to a USBasp programmer with flywires. Here’s the schematic of the PCB I made:
I mostly built this because I had never worked with the ATtiny13A and I wanted to program it myself and see how (if) it worked as I expected before trying the assembly from the Sigma fix project on it. Besides, it’s convenient to have a platform to flash the final code onto the microcontroller anyway. Here’s how it looks, all built, but with the microcontroller already removed (it lives inside the lens now):
The ‘board’ is just a 6-pin connector with the standard AVR ISP pinout as used by the USBasp programmers you can buy just about anywhere. I also drilled the holes for two rows of headers that would allow the board to be plugged into a breadboard, but didn’t bother soldering them on. I fitted a single RGB LED package that houses three individual LEDs, which connect via three BSS123 small-signal MOSFETs to three pins on the microcontroller. There are some resistors to pull down the MOSFET gates, to pull up the reset pin (not strictly necessary) and to limit the LED current. And also a pair of capacitors, 10uF and 100nF, for the obligatory power supply filtering. That’s it; as simple as can be!
I used this to do a little ‘blink test’ on the ATtiny13A using MCUdude’s MicroCore for Arduino. It worked right away, as expected, and flashing the compiled HEX file from the Sigma fix project went flawlessly. Note: you want the ATtiny13 to run on its 9.6MHz internal RC oscillator for correct timing. The rest of the fuse settings aren’t critical; I left the brownout level at the 2.7V default set by MicroCore. Make sure to leave the external reset pin enabled and to keep SPI programming enabled, otherwise you’re in (mild) trouble if you need to reprogram the chip for whatever reason.
Now for the hardware part in the actual lens. There are many ways to skin a cat, as they say, and it depends on the lens what will work. As it turns out, the Github project page shows a different lens from my 24/2.8. Since it’s just a small microcontroller that needs to fit in there, there’s usually some space, and a tiny SMD resistor (I used a 0603 part, which is really small indeed) can also always be fitted somewhere.
Let’s start by having a look at our patient:
The contacts of interest are the group of five on the left; we can ignore PGND and VBAT for our exercise. In fact, we can also ignore DLC (data from lens to camera), since we won’t interfere with data sent by the lens back to the camera. To access the innards, remove the black plastic retaining ring that sits around the rear element. I actually recommend removing the metal mount ring as well to make some more space.
Removing the black retaining ring makes it possible to access the little block with the electrical contacts; the contacts are linked with a piece of flat cable (the orange stuff sticking out of the connectors towards the lens element) to the rest of the electronics. I didn’t actually dig into the other electronics as it wasn’t necessary to go that far.
As it happens, the contacts are soldered to the flat cable, and these soldering contacts form a very good place to make our interface with the ATtiny. I removed the solder from the DCL contact, and luckily enough, there turned out to be about 0.1mm clearance between the flat cable contact and the mount contact lug. I could solder the little 0603 SMD 330 Ohm resistor across that tiny gap.
I also soldered four lengths of enameled wire to the contacts to make the interface with the microcontroller. Yeah, I know, I could have trimmed those a bit shorter, but I figured I wanted some flexibility so I could easily manoeuvre the microcontroller into a convenient spot.
I chose to push the microcontroller into an empty space between one of the existing PCBs and the lens barrel. This way, it keeps out of the way of the movement of the optical elements as the lens is focused. Since the wires are enameled, I didn’t have to worry about them shorting anything, and I just pushed them in a jumble out of the way, underneath one of the metal mounting lugs for the plastic retaining ring.
Alright, it’s not super tidy, but it’s good enough for now. Besides, I had no idea if this would work, so I went for the quick & dirty approach. This is also why I took the effort to make that little ‘experimenter board’ earlier – just to verify that the microcontroller I picked would do anything at all. After all, this particular application gives no visible feedback or other confirmation that it actually does what it’s designed to do. The only way to find out is to actually put it inside the lens, mount the lens onto the camera and pray for the best. Since the relevant contacts are buried between the lens and the camera, there’s also no practical way to hook up an oscilloscope or a logic analyzer to see what’s happening on the bus – at least not without permanently modifying the barrel of the lens.
Well, guess what? It worked! Just like that. Usually, when working on microcontroller projects, there’s a snag here and there, a bug that needs to be ironed out, etc. Not this time! It just worked out of the box. And yes, the lens now works on my other (newer) EOS cameras, and even on the digital EOS 7D!