July 4, 2018

Thermal Testing

With the various layout errors on the last set of boards fixed, the controller seems to work as expected, so I've been doing some thermal testing.

I strapped the controller to a much bigger motor:  Even with the old version of the controller, the small motors I've been using would catch fire before the controller did. 

Here it is attached to a T-motor U12:


The test setup, with the motor and controller in view of Bayley's thermal camera


For the first tests, I put the test current on the d-axis, and 2A on the q-axis just to get the motor spinning. 

Originally, I had a 0.5mm layer of thermal pad between the board and an aluminum heatspreader and aluminum motor mount beneath.

At 15 amps, basically nothing happened.  I made the passives for the buck converter and the 3.3V linear regulator much smaller on this board, so it was good to see that they're doing fine.


At 20 A you could start to see the FETs getting warm. 


And here's 30A.  I only let this one run for around 30 seconds.  The FETs in the middle of the board are noticeably hotter than those by the edges, which I partially blame on poor contact with the thermal pad.  Since the screws are at the edges of the board, the board flexes when they are tightened, and all the clamping pressure is at the corners.  Also interesting: it looks like the trace from the source of the FET in the bottom-right corner is getting pretty warm.  The hole for the mounting screw makes the phase polygon pretty narrow there.  I should be able to fix that by adding a row of vias from the source legs to the internal phase layer (which I should have added before).  These boards only have 1 oz copper on top and bottom, and 1.5 oz internal layers.


And the plots.  With this setup, I wouldn't be comfortable running the controller at 30A continuous.


I switched out the thermal pad for a layer of kapton tape for electrical insulation, and some thermal paste.  Contact in the middle was still not great, but overall it was a lot better than the thermal pad.


Performance was greatly improved, but there's still a ~7 C difference between the hottest and coldest FETs.  Furthermore, in the middle of the board, the low-side FETs are several degrees hotter than the high-side FETs - traces Sp3 and Sp2 respectively on the plot below:


It looks like with this setup, the controller can handle around 30A continuously.  I'm pretty satisfied with that, but I think the performance would be substantially improved by adding an aluminum cover over the motor controller which presses firmly against the row of FETs, to get good contact on the bottom of the board.  The cover would also serve as a top-side heatsink, which is surprisingly effective with this FET package, as there's only ~0.3mm of plastic above the copper clip on top of the die (I sanded one down to measure).

I've been using these 3.3 mOhm FETs, which I like because they're cheap and have only 50 nC gate charge, so they don't stress the DRV chip switching at 40 kHz.  I swapped out the FETs for the 1.4 mOhm flavor, since I had a bunch of them lying around.  These have ~half the resistance, but about twice the gate charge, so twice the switching loss.  It works out slightly favorably in this case (for currents above around 20A) - I expect around 10% less dissipation per-FET at 30A.
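The tradeoff can be sketched with a quick loss model.  The switching-loss numbers below are my rough estimates (not datasheet math), and the current model assumes sinusoidal phase current with each FET conducting about half the time:

```python
# Rough per-FET loss comparison: 3.3 mOhm / 50 nC part vs 1.4 mOhm /
# ~100 nC part.  Switching-loss values are assumed estimates, scaled
# with gate charge; the exact dissipation split depends on them.

def fet_loss(i_phase_peak, r_ds_on, p_switching):
    """Conduction + switching loss per FET, in watts."""
    # /sqrt(2) for phase RMS, /sqrt(2) again for ~50% conduction duty
    i_rms_fet = i_phase_peak / 2.0
    return i_rms_fet ** 2 * r_ds_on + p_switching

# Assumed switching loss at 40 kHz for the 50 nC part; doubled for
# the ~100 nC part since switching loss scales roughly with Qg:
p_old = fet_loss(30, 3.3e-3, 0.25)   # ~1 W per FET at 30 A
p_new = fet_loss(30, 1.4e-3, 0.50)   # lower, despite doubled switching loss
print(p_old, p_new)
```

With these assumed numbers, the crossover where the lower-resistance part starts winning lands in the low 20s of amps, consistent with what I saw.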

Here's 30A on the 1.4 mOhm FETs.  Check out how much hotter the DRV chip gets, doing twice the gate drive work.  ~10 C hotter than before.  There are actually lots of better transistors available now, even for similar price, so I can get power dissipation down around 15-20% further.


To make the tests a bit more stressful, I put the motor on the dyno so I could actually get some power into and out of it.  I made a new motor mounting plate which doesn't cover up the controller:



This time, I put all 30A on the q-axis, and ramped up the speed until the achievable current just started to drop off (which is the maximum power point for this motor).  At 22V on the bus, I was able to put 500 watts in and get 400 back out.  I let everything cook for a while to get up to temperature.  The only noticeable difference was that the DC link wires were roasting.  The 16 AWG is not really meant for 23A DC + ripple current in continuous duty.  Going into this test, I was a little worried about how the ceramic DC link capacitors would handle the current ripple, since there's only 60 uF on the board, but they didn't get any hotter.
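As a quick sanity check on those numbers:

```python
# 500 W in at a 22 V bus implies the DC link current that was
# cooking the 16 AWG wire, plus the end-to-end efficiency.
p_in, p_out, v_bus = 500.0, 400.0, 22.0
i_dc = p_in / v_bus          # average current through the DC link leads
efficiency = p_out / p_in    # in to out, through the whole stack
print(round(i_dc, 1), efficiency)  # ~22.7 A, 80%
```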


To see where all the temperature drop was coming from, I got a side-view of the heatspreader and motor mount assembly.  Point Sp3 is on the side of the motor mount, point Sp2 is on the side of the heatspreader, Sp1 is the hot FET, and Sp4 and 5 are the cooler FETs.  This is after running for several minutes continuously at 30 A.


The heatspreader, motor mount, and the back of the motor were all within a couple degrees of each other, with all the temperature drop happening between the FETs and the heatspreader.


With an estimated power dissipation of 1 watt per FET at 30A, this gives a thermal resistance of 12 C/watt between the transistor and the heatspreader for the cooler FET, and 17 for the hot one.  

Here's a sketched cross section of the 3 different FET layouts on the board, and why I think the low-side FETs with shunts get so much warmer than the others (in addition to the poor clamping at the middle of the board).

First, the high-side FETs, which are the coolest.  Labeled stack-up on the left, main heat conduction path on the right.  The vias beneath the drain tab go straight into the big V+ plane on the bottom, which is copper, so super thermally conductive in the plane.  This effectively increases the area which conducts through to the heatspreader.


The low-side FET without the shunt, which was much cooler than the other two, looks like this:  It has a bunch of vias from the source pins straight into the ground plane, in addition to the vias beneath the drain tab.  The source pins are part of a monolithic copper sheet ("clip") which sandwiches the top of the die, so they can very effectively sink heat away from the die.

And finally the low-side FETs with shunts, which get kind of screwed, with no big plane beneath them, and a shunt attached to the source pins:



One way to help out those FETs might be to add some extra heat-sinking planes on the bottom of the board, like this:


But more importantly than trying to tweak the board layout to slightly improve the thermal performance, where is all the thermal resistance coming from in the first place?

According to the FET datasheet, the package to base thermal resistance is around 1.2 C/W.  

I got in touch with PCBWay, and their via plating thickness is 18-22 microns, independent of the copper plating thickness.  According to the via calculator, the thermal resistance of the patch of drain vias should be 4.6 C/W.
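That number can be roughly cross-checked from the plating geometry.  The 0.3 mm drill, 1.6 mm board thickness, and via count below are assumptions for illustration, not measured from this layout:

```python
import math

# Per-via thermal resistance of a plated barrel: R = L / (k * A),
# where A is the cross-section of the plated copper annulus.
k_cu = 385.0             # copper thermal conductivity, W/(m*K)
length = 1.6e-3          # assumed board thickness, m
d, t = 0.3e-3, 20e-6     # assumed drill diameter; PCBWay's ~20 um plating
a_barrel = math.pi * t * (d - t)
r_one = length / (k_cu * a_barrel)       # C/W for a single via
n_for_target = r_one / 4.6               # vias in parallel to hit ~4.6 C/W
print(round(r_one), round(n_for_target))
```

With these assumptions, a single via is a couple hundred C/W, so it takes on the order of 50 vias in parallel under the drain tab to get down to the calculator's 4.6 C/W figure.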

There's also a layer of 1 mil Kapton tape, electrically insulating the heatspreader.  But the total thickness of the tape, including the adhesive, is 3 mil, or 75 microns.  The 5x5mm patch under each FET has a thermal resistance of around 6.5 C/W.  

Ignoring the thermal paste, that's up to 12.3 C/W, which is right in line with the results from the thermal camera.
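Summing the stack makes that explicit.  Note the tape conductivity here is an assumption: reproducing the 6.5 C/W figure takes ~0.46 W/m·K, in the range of thermally-upgraded polyimide, while plain Kapton HN film is closer to 0.12:

```python
# Stack-up thermal resistance estimate, FET die to heatspreader.
r_package = 1.2          # package to base, C/W, from the FET datasheet
r_vias = 4.6             # drain via patch, C/W, from the via calculator
t_tape = 75e-6           # 3 mil tape including adhesive, m
k_tape = 0.46            # W/(m*K) -- assumed, see note above
a_pad = 5e-3 * 5e-3      # 5x5 mm patch under each FET, m^2
r_tape = t_tape / (k_tape * a_pad)
r_total = r_package + r_vias + r_tape
print(round(r_tape, 1), round(r_total, 1))  # ~6.5 and ~12.3 C/W
```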

So the Kapton is the biggest culprit here.  I'd like to try using a hard-anodized aluminum heatspreader with no additional electrical insulation.  The hard-anodized coating is only 0.5-1 W/m·K, but it can be very thin. 

The next biggest resistance source is the vias.  The PCBWay via plating thickness corresponds to a little over 1/2 oz, but I could probably do better with some other PCB manufacturer.

Also worth noting, all these tests were with no airflow over the controller.  Pointing a good blower at the back dropped the temperatures by around 20C.

Next hardware steps:  wait for AMS position sensors and DRV8323RS chips to come back in stock, design a FET-clamping cover, and experiment a bit more with heatsinking.  


June 16, 2018

Integrated Motor Controllers V2: Searching for the noise

Remember how I said I was going to re-do the integrated motor controller using the new TI DRV8323 chip?  Well, I did that.  First round of boards came in and had a few errors, but I think I've found them all. 

Changes:

  • DRV8323RS for gate drive, buck converter, and current shunt amplifiers
  • Smaller footprint - 37x37mm
  • Reasonable shape!  Square mounting hole pattern, centered position sensor
  • All components (except for the position sensor) on the top side - the underside is flat for easy mounting and heat-sinking
  • Designed for through-board cooling, with FET pads thermal-via'd to the underside
  • True 4-layer design, 0402 package passives wherever possible, 6 mil trace/spacing, to make it small
  • Smaller buck passive, smaller CAN transceiver

Front:



Back side.  Notice how empty it is.


Attached to a motor.  Unfortunately I forgot to take a picture of the board before doing a lot of reworking on it, so it's kind of ugly now:


After correcting a couple easy-to-fix mistakes with bodge wires (I forgot to connect the CAN transceiver standby pin to ground, and routed the wrong current sense amplifier output to the A/D, since I'm only using 2 of the 3), the controller worked (as in, it was able to spin a motor), but it was clear immediately that something was wrong with the current sensing.  There was tons of audible noise once the current loop was closed, so I took a look at the measured phase currents.  Here's a plot of the output from the 2 A/Ds, while setting a constant d-axis voltage and slowly rotating the motor by hand:


The first current measurement looks beautiful, but the second one is absolute garbage.  The big spikes are ~750 A/D counts, which would correspond to 15 amps.  Time to break out the scope:
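The counts-to-amps conversion behind that 15 A figure, assuming a 12-bit, 3.3 V A/D and a 40 V/V amplifier gain (the gain setting is my guess, though 40 V/V is one of the DRV8323's gain options):

```python
# Back out the count-to-amp scaling through a 1 mOhm shunt.
v_per_count = 3.3 / 4096           # assumed 12-bit A/D, 3.3 V reference
gain, r_shunt = 40.0, 1e-3         # assumed CSA gain; 1 mOhm shunt
a_per_count = v_per_count / (gain * r_shunt)
print(round(750 * a_per_count, 1))  # ~750 counts of spike is ~15 A
```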

Here's the output of the amplifier:  The big spikes are at around 8.7 kHz, but there's also a lot of higher-frequency noise:


Zooming in:  There are also smaller pulses at 675 kHz.  That's almost certainly the switching frequency of the buck converter on the DRV chip:


Next step was to see if this noise actually showed up on the input to the amplifier, or whether it was showing up after the amplifier or inside the chip somehow.

Test points attached:


I measured in a few places.  First, directly across the shunt.  As expected, the noise didn't show up directly across the shunt - unless there were actually many amps of noise flowing through the 1 milliohm resistor, there shouldn't be any volts across it.

I moved the test points to the two vias directly before the input to the DRV:


And the noise appeared.  For these measurements, the Rigol didn't really have the resolution, so I broke out the Yokogawa DL708E, which has 8 isolated channels with 12 bits of resolution (4 bits more than the Rigol).


After two evenings of probing around and making very little progress, I finally found the two layout errors that combined to cause this problem.

Here's the board with ground highlighted.  The logic ground up top is on an inner layer, and the power ground below is on both the bottom plus an inner layer.  Take a look at the border between the two:


The three highlighted traces crossing the border are the traces for sensing across the current shunts (and from the sources of the un-shunted FET).  Yep, I accidentally forgot to connect the logic ground to the power ground.

Which brings up the question:  How did the board even turn on at all?  The traces from the shunts are inputs to amplifiers, and should be high impedance.  It shouldn't be possible to power all the logic through them.  And why didn't Eagle yell at me when I ran DRC, or show any air wires?

Taking a closer look at the noisy shunt, this time with the inner layers visible:


Oh.  Oops.  When I dropped in the vias for the shunt, the ones on the ground-side of the shunt got automatically connected to the internal ground plane.  Which is also why the logic ground plane didn't show up as being disconnected from the power ground plane in the DRC.  So all the power for all the logic was passing from the power ground, through the current sensing trace, through the vias, to the logic ground.  That'll do it.  

At least I've learned a good lesson from this:  Always add kelvin connections to the shunt footprint in Eagle, even if the actual shunt resistor doesn't have them.   This way, the current sensing traces won't be part of the ground net, and this kind of mistake would be impossible.

New boards are on the way, so hopefully I'll be able to do some stress-testing in the next few weeks.  On the firmware side, I plan on finishing all my autocalibration stuff soon.  Then I'll easily be able to slap this controller on all my small motors, and collect a huge pile of motor data.

May 27, 2018

Big Dyno Beginnings

Bayley and I have been thinking about building a bigger motor dyno for a while now, based off the same power system as the Big Kart.  Over the last year we've been slowly collecting parts for it - a torque sensor, a couple more Sonata HSGs, Prius inverters, etc, and I've finally started putting together the mechanical side of things.

Whereas the small dyno is good for 10 N-m of torque and 6000 RPM, this one will be good for 60 N-m and 15,000 RPM (although not simultaneously), and somewhere around 30 kW power.
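For scale, here's where those torque and speed corners sit relative to ~30 kW (the 4800 RPM and 19 N-m crossover points below are just computed from P = τω, not specs):

```python
import math

# Shaft power from torque and speed: P = torque * angular velocity.
def shaft_power_w(torque_nm, rpm):
    return torque_nm * rpm * 2.0 * math.pi / 60.0

p_torque_corner = shaft_power_w(60, 4800)   # full torque holds to ~4800 RPM
p_speed_corner = shaft_power_w(19, 15000)   # ~19 N-m available at 15,000 RPM
print(round(p_torque_corner), round(p_speed_corner))  # both ~30 kW
```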

First step was acquiring the appropriate torque sensor.  The DC-output one I have on the small dyno is extremely convenient, because you can just measure the output with a DAQ or A/D, but the AC-coupled ones are much more common on eBay.  Unfortunately, the readouts for the AC torque sensors are very expensive and hard to come by.  A pile of really big, really cheap DC-operated torque sensors showed up on eBay for $28 apiece, so we got one and the MIT FSAE team collected the rest of the lot.

10,000 inch-pounds! (~1100 N-m) and 8500 RPM.  With ~2" shafts on either end.  Not actually very useful.


We gutted the readout electronics from it:


And shoved them into this AC-operated 500 in-lbs, 15k RPM torque sensor.  This thing is pretty much perfectly matched to the Sonata HSGs.


Several months ago I went on a shaft-coupling machining spree, making the two Lovejoy couplings for the torque sensor, and two more for a pair of the HSGs:





The shaft couplings all have internal keyways, which I manually shaped on the lathe.  I ground HSS inserts for a boring bar, locked the lathe spindle, and shaped the keyways with the carriage feed, retracting the cross slide ~.005" per stroke, with occasional spring passes.  The keyways can be easily widened by shifting the tool up a little on the toolpost.


I've been pleasantly surprised by how nicely the keyways turn out with this process.  Even for some big 6mm keyways for pulley hubs for the Big Kart, the results were very clean.




The shape of the Sonata HSGs leaves much to be desired.  While at least they are completely sealed and have real shafts, unlike the motors in most hybrid transmissions (e.g. the Prius or Fusion motors, which are tightly integrated into the powertrain, and have to be freshly rehoused to be used at all), they are very strangely shaped.  Their cast aluminum housings have almost no flat or square surfaces, making them very challenging to measure.  I was able to eventually find the positions of all three mounting feet and the shaft by using a surface plate, a height gauge, a set of adjustable parallels, some custom-turned shafts which press-fit into the bolt holes, and one sacrificial motor housing which was bandsawed in half.  The last measurements, which located the motor shaft with respect to the feet, could not be reasonably made without referencing some of the machined bores on the inside of the housing.


I machined a pair of HSG mounts.  The feet were turned on MITERS's newly acquired Hardinge HLV lathe, and the base was CNC milled on the Super Mini Mill.  The turned feet press-fit into locating pockets on the base, and are fastened from underneath.  The two diagonal feet have 14mm bosses on top, which mate with corresponding 14mm locating bores in the bottom HSG casting.


The underside:  Since the parts needed to get flipped over anyways, I did some gratuitous pocketing on the bottom:


The painful measurement process paid off.  The motors dropped into place perfectly:


Like the small dyno, the base of the big dyno is a huge slab of aluminum.  I started out with a big optical breadboard I got for free last fall.  It's a roughly 1'x3'x1" cast aluminum sheet, with a 2" grid of 1/4-20 holes in it.  Somehow this plate was extremely warped, with roughly a 1.5mm height difference between the center and the edges, in the long direction.  I set up the plate on the Bridgeport, and faced the whole thing:

The plate was substantially longer than the x-axis travel of the mill, so I had to swivel the column around halfway through to face the whole thing.


To fixture it, I machined a little step into each end.  This way the step clamps hold it down right at the edges, and don't put any bending stress into the plate:



Here you can see the huge gap between the center of the plate and the table of the mill:


I was expecting the plate to continue warping as I machined it from residual stress in the material, but by some miracle it didn't.  Once I faced the first side, it just stayed perfectly flat.  I didn't even bother facing the underside, because nothing will be mounted to it.

Pile of chips post-facing:



I squared up all the edges as well, and gave the corners a healthy chamfer. 

Here's the absorber mounted:


And a preview of the whole assembly.  I still need to make the foot to space the torque sensor up appropriately.  The first motor we plan on dyno-ing on this thing is the HSG, so we can squeeze every last drop of performance out of the Big Kart, but after that, the motor on the right will be swappable for any other motor which fits.  The plate is mounted to the top of a cart by some rubber shock mounts to vibration-isolate it.


In related news, the Big Kart is frighteningly fast now, thanks to the double-motors, Polychain-based drivetrain, and new lookup tables based on the stall data and better inductance measurements.  Hopefully we'll get some decent videos of it, now that the weather's shaping up.  Here it is after some thrashing by Michael: