I’ve been meaning for a while now to cycle back and work on the SSMU project again. With the Hackaday Prize entry deadline quickly approaching, I decided to put something together for motivation. Most of the details will be tracked over on their projects site:

The idea is still the same – integrate core lab-equipment functions over the audio frequency range and tie into software running a circuit simulator. Instead of sticking to through-hole components and an MSP430, I’m going to use an STM32L100RBT6 and a custom PCB. The currently planned features are:

  • 1-2* voltage sources (0-15V, 1 MSPS DAC)
  • 1 current source (up to about 50mA)
  • 2+ voltage measurements (12-bit, 1 MSPS ADC)
  • 2 current measurements
  • 2 Bode plot measurements (gain and phase up to 100kHz)
  • Fixed and adjustable power supplies (3.3V, 5V, 0-9V)
  • Adjustable current limits for the 0-9V rail and DAC output
  • Protocol analyzer (at least I2C, SPI, and UART)
  • LCR meter
  • Speaker output stage
  • USB and wireless (nRF24L01+) connectivity

*One voltage source is shared with the current source but will be accessible when the current source is not in use

I’ve picked most of the components except for some current and voltage protection chips. The BOM is sitting somewhere around $20-25, so I’m on track not to break the bank.

Instead of measuring magnitude and phase by brute-force sampling in software, I’m going to extract them with a few op amps. In the picture below, U1 attenuates the voltage at the DUT (which can range up to 9-15V) and optionally provides amplification for lower-amplitude signals. C2 compensates for some of the input capacitance the DUT might present. U2 and D1 form a precision rectifier and, along with C1, constitute a peak detector. This relatively stable voltage can then be sampled by the microcontroller’s ADC as the magnitude of the waveform.
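
Working backwards from an ADC reading to the waveform amplitude is just a couple of scale factors. Below is a minimal sketch of that conversion; the 3.3V reference, 12-bit resolution, and front-end gain value are illustrative assumptions, not final design numbers.

```c
/* Recover the DUT waveform amplitude from the sampled peak-detector
 * voltage. Reference, resolution, and gain are assumed values. */
#define ADC_VREF      3.3   /* ADC reference, assumed 3.3 V               */
#define ADC_BITS      12    /* matches the planned 12-bit ADC             */
#define FRONTEND_GAIN 0.2   /* example U1 attenuation: 15 V at DUT -> 3 V */

double adc_to_amplitude(unsigned code)
{
    double v_peak = (double)code * ADC_VREF / ((1 << ADC_BITS) - 1);
    /* The precision rectifier (U2 + D1) ideally cancels the diode drop,
     * so C1 holds the peak of the conditioned waveform; undoing the
     * front-end gain gives the amplitude back at the DUT. */
    return v_peak / FRONTEND_GAIN;
}
```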

This value is then divided down and compared against the input waveform to produce a digital output indicating when the waveform is above or below this threshold. By comparing this output with the one from the other Bode circuit (there is one on either side of the DUT), the microcontroller can determine the phase shift.
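
In firmware this comes down to timestamping edges on the two digital outputs, e.g. with timer capture channels. A sketch of the arithmetic, assuming tick-count timestamps (the names and units here are illustrative, not from the actual firmware):

```c
/* Phase shift between the input-side and output-side Bode outputs,
 * computed from rising-edge timestamps measured in timer ticks. */
double phase_shift_deg(long t_in, long t_out, long period_ticks)
{
    long dt = (t_out - t_in) % period_ticks;
    if (dt < 0)
        dt += period_ticks;   /* wrap into [0, period) */
    return 360.0 * (double)dt / (double)period_ticks;
}
```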




This project started out as a way to see how much useful work I could get out of an MSP430G2553. Since the closest thing I have to a personal electronics lab is a National Instruments myDAQ from college, I decided to try to replicate some of its functionality (namely the ability to act as an SMU) as well as integrate it with a circuit simulator. The project goals were the following (in no particular order):

  • Low cost (< $100)
  • Through hole components when possible/feasible
  • At least 200ksps ADC
  • Sample window of at least 0.15 seconds (enough for 3 cycles of a 20Hz wave)
  • Sweepable sine wave output (20Hz – 20kHz)
  • Source and measure current (specific specs TBD)
  • Integration with Intel Galileo board running ngSpice
  • GUI on desktop and Android
  • Support both wired (USB) and wireless (WiFi) connections from Galileo
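
The ADC goals above already pin down a minimum buffer size. A quick check, assuming each sample is stored as a 16-bit word:

```c
/* Samples needed to fill a capture window at a given sample rate. */
unsigned long samples_needed(unsigned long rate_sps, double window_s)
{
    return (unsigned long)(rate_sps * window_s + 0.5);
}
```

At 200ksps, a 0.15 second window works out to 30,000 samples, or 60,000 bytes at two bytes each: far more than a small microcontroller holds on chip.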

Currently I have a desktop GUI (written in Java with the libGDX framework) connected over USB to the Galileo. Depending on the commands the GUI sends, the Galileo will either run an ngSpice simulation file or request new data from the MSP430’s ADC. The results are written to a file and sent back to the GUI. After some file parsing to determine how many signals are in the file and the number of samples, the GUI updates the plot as shown below.

Voltage in (red), voltage out (white) ngSpice signals plus ADC (green).

Here the simulated circuit was a resistor divider (factor of 2) with a 7.1kHz sine wave input. When new ADC data is requested, the MSP430 starts stepping through a look-up table and writes each value it reads to the DAC (TLV5618). The internal ADC is then configured to start a block of conversions and to use the DMA channel. This way results are copied directly to memory with low jitter (1-3 MCLK cycles, or 0.0625us-0.1875us at the 16MHz clock) and without interrupting the CPU.

In the above picture, the DAC was outputting a sine wave with 16 steps per period at about 7.1kHz. The green data points are from the ADC which was running at 62.5ksps.

Currently, the fastest sine wave the MSP430 can synthesize is 7.5kHz with 16 steps per period. I might be able to double this by using a DAC that only requires a single byte to update its output instead of two like the TLV5618. I included a picture of the myDAQ’s oscilloscope reading below. It doesn’t quite agree with 7.1kHz, but I’ve seen the reading jump a few hundred hertz depending on the timescale.
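
The arithmetic behind that estimate: each period takes 16 DAC updates, and each update costs one SPI frame. A rough ceiling, ignoring per-frame setup overhead (the SPI clock value in the comment is illustrative, not measured):

```c
/* Rough maximum synthesized sine frequency: SPI clock divided by bits
 * per DAC update, divided by look-up-table steps per period. Per-frame
 * gaps and CPU overhead are ignored, so real numbers come in lower.
 * e.g. max_sine_hz(1.92e6, 16, 16) gives 7.5 kHz. */
double max_sine_hz(double spi_hz, unsigned bits_per_update, unsigned steps)
{
    return spi_hz / bits_per_update / steps;
}
```

Halving bits_per_update from 16 to 8 doubles the result, which is where the "double this with a single-byte DAC" idea comes from.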


Since the MSP430G2553 has only 512 bytes of RAM, only 30 10-bit ADC results can be stored at a time before I start to see stack overflows. Even though 8 bits would provide acceptable data, the ADC and its DMA channel always copy both bytes of each conversion.
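
The buffer math is small but worth writing down (the per-sample cost is fixed by the 16-bit copies; the 30-sample figure is what I see in practice before the stack collides with the buffer):

```c
/* RAM budget sketch for the MSP430G2553's 512 bytes: the ADC's transfer
 * channel always moves a full 16-bit word, so even 10-bit (or 8-bit)
 * results cost 2 bytes each. */
enum { SAMPLES = 30, BYTES_PER_SAMPLE = 2 };

unsigned buffer_bytes(void)
{
    return SAMPLES * BYTES_PER_SAMPLE;   /* 60 of the 512 bytes */
}
```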

My next steps for the project are the following:

  • Add an external SPI RAM chip to buffer more samples. The Galileo and MSP430 will trade off reading and writing to it.
  • Determine the specs required for current measurement and order the appropriate parts.
  • Provide more signal conditioning so voltages aren’t limited to 0-3.3V.
  • Add an interface to the GUI for editing ngSpice files. Currently I have a telnet window open to make changes.
  • Test the GUI on Android, which will mean sending the data over WiFi.
  • Test out a clock-tunable RC low-pass filter + VCO IC for lower distortion sine waves over the entire 20Hz-20kHz range.