Don’t Flake on Your Fish—Feed them Automatically

We get it. You love your fish, but they can’t bark or gently nip at your shin flesh to let you know they’re hungry. (And they always kind of look hungry, don’t they?) One day bleeds into the next, and you find yourself wondering if you’ve fed them yet today. Or are you thinking of yesterday? Fish deserve better than that. Why not build them a smart fish feeder?

Domovoy is a completely open-source automatic fish feeder that lets you feed your fish on a schedule, over Bluetooth, or manually. This simple yet elegant design uses a small stepper motor to drive a 3D-printed auger to deliver the goods. Just open the lid, fill ’er up with flakes, and program up to four feedings per day through the three-button-and-LCD interface. You can even set the dosage, which is measured in complete revolutions of the auger.
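The dosing scheme (whole auger revolutions, up to four scheduled feedings a day) is easy to picture in code. This is a hypothetical sketch, not Domovoy’s firmware; the 200 steps-per-revolution figure is just the common default for small steppers.

```python
# Hypothetical sketch of Domovoy-style dosing math. The step count and
# schedule format are illustrative assumptions, not from the project.

STEPS_PER_REV = 200          # typical 1.8-degree stepper (assumed)
MAX_FEEDINGS_PER_DAY = 4     # per the three-button/LCD interface

def steps_for_dose(revolutions):
    """Dosage is whole auger revolutions; return stepper steps to issue."""
    if revolutions < 1:
        raise ValueError("dose is at least one full revolution")
    return revolutions * STEPS_PER_REV

def is_feeding_time(schedule, now_minutes):
    """schedule: minutes-past-midnight entries, at most four per day."""
    if len(schedule) > MAX_FEEDINGS_PER_DAY:
        raise ValueError("at most four feedings per day")
    return now_minutes in schedule

print(steps_for_dose(2))                  # 400 steps for a two-revolution dose
print(is_feeding_time([480, 1080], 480))  # True: the 8:00 AM feeding
```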

It’s built around an ATmega328P, but you’ll have to spin your own board and put the feeder together using the creator’s excellent instructions. Hungry to see this feeder in action? Just swim past the break.

Can’t be bothered to feed your fish automatically? Train them to feed themselves.

Posted in atmega328p, bluetooth, fish, fish feeder, how-to, Microcontrollers, rtc | Leave a comment

An Ultrasound Driver With Open Source FPGAs

Ultrasound imaging has been around for decades, but Open Source ultrasound has not. While there are a ton of projects out there attempting to create open ultrasound devices, most of that effort is concentrated on the image-processing side of things, not the exceptionally difficult problem of pinging a transducer millions of times a second, listening for the echo, and running that through a very high-speed ADC.
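A quick back-of-the-envelope calculation shows why the receive side needs such a fast ADC. The 1540 m/s speed of sound in soft tissue is the standard textbook assumption, and the 20 MHz sample rate is an illustrative figure, not a spec from this project.

```python
# Pulse-echo timing, back-of-the-envelope. Speed of sound in soft
# tissue is roughly 1540 m/s (standard assumption).

C_TISSUE = 1540.0  # m/s

def echo_depth_m(round_trip_s):
    """Echo travels out and back, so depth is half the path length."""
    return C_TISSUE * round_trip_s / 2.0

def round_trip_s(depth_m):
    return 2.0 * depth_m / C_TISSUE

# A reflector 5 cm deep answers in about 65 microseconds...
t = round_trip_s(0.05)
# ...and an assumed 20 MHz ADC grabs roughly 1300 samples in that window.
samples = t * 20e6
print(round(t * 1e6, 1), int(samples))
```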

For his entry into the Hackaday Prize, [kelu124] is doing just that. He’s building an ultrasound board that’s built around Open Hardware, a fancy Open Source FPGA, and a lot of very difficult signal processing. It also uses some Rick and Morty references, so you know this is going to be popular with the Internet peanut gallery.

The design of the ultrasound system is based around an iCE40 FPGA, the only FPGA with an Open Source toolchain. Along with this, there are a ton of ADCs, a DAC, pulsers, and a high voltage section to drive the off-the-shelf ultrasound head. If you’re wondering how this ultrasound board interfaces with the outside world, there’s a header for a Raspberry Pi on there, too, so this project has the requisite amount of blog cred.

Already, [kelu] has a working ultrasound device capable of sending pulses out of its head and receiving the echo. Right now it’s just a few pulses, but this is a significant step towards a real, working ultrasound machine built around a reasonably Open Source toolchain that doesn’t cost several arms and legs.

Posted in 2018 Hackaday Prize, iCE40, The Hackaday Prize, ultrasound | Leave a comment

Biasing That Transistor: The Emitter Follower

We were musing upon the relative paucity of education with respect to the fundamentals of electronic circuitry with discrete semiconductors, so we thought we’d do something about it. So far we’ve taken a look at the basics of transistor biasing through the common emitter amplifier, then introduced a less common configuration, the common base amplifier. There is a third transistor amplifier configuration, as you might expect for a device that has three terminals: the so-called Common Collector amplifier. You might also know this configuration as the Emitter Follower. It’s called a “follower” because it tracks the input voltage, offering increased current capability and significantly lower output impedance.

The emitter follower circuit

Just as the common emitter amplifier and common base amplifier each tied those respective transistor terminals to a fixed potential and used the other two terminals as amplifier input and output, so does the common collector circuit. The base forms the input and its bias circuit is identical to that of the common emitter amplifier, but the rest of the circuit differs in that the collector is tied to the positive rail, the emitter forms the output, and there is a load resistor to ground in the emitter circuit.

As with both of the other configurations, the bias is set such that the transistor is turned on and passing a constant current, keeping it in the region where small base current changes produce a nearly linear, larger change in collector current. As the incoming signal varies the base current, there is a corresponding change in the collector current dictated by the transistor’s gain, and thus an output voltage is generated across the emitter resistor. Unlike the common emitter amplifier, this voltage increases or decreases in step with the input voltage, so the emitter follower is not an inverting amplifier.

The keen-eyed reader will have noticed at this point that since the base-emitter junction of a transistor is also a diode, it will maintain approximately the same voltage across itself regardless of the current flowing through it. For a silicon transistor this is around 0.6 V, so the output voltage on the emitter will always be about 0.6 V lower than the input voltage on the base. Thus the voltage gain of an emitter follower is always just a tad less than 1, and you might expect it to be of little use as an amplifier, were it not for its significant current gain. The output impedance of an emitter follower is significantly lower than that of a common emitter amplifier, allowing it to drive much more demanding loads. You will often find it used as a buffer stage for this reason, and a handy example can be found on the output of an early op-amp we took a look at earlier in the year.
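A quick worked example puts numbers on those claims, using the usual first-order approximations (a 0.6 V base-emitter drop, and an intrinsic emitter resistance of 25 mV divided by the emitter current at room temperature). The component values are invented for illustration.

```python
# Rough emitter-follower numbers using first-order approximations.
# Values here are illustrative, not from any particular circuit.

V_BE = 0.6  # silicon base-emitter drop, volts

def output_voltage(v_base):
    """The emitter sits one diode drop below the base."""
    return v_base - V_BE

def voltage_gain(r_emitter_ohms, i_emitter_ma):
    """Gain is Re/(Re + re): always just a tad under 1."""
    re = 25.0 / i_emitter_ma  # intrinsic emitter resistance, ohms
    return r_emitter_ohms / (r_emitter_ohms + re)

print(output_voltage(5.0))                 # 4.4 V on the emitter for 5 V in
print(round(voltage_gain(1000, 2.5), 3))   # ~0.99 with a 1k load at 2.5 mA
```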

We’ve now taken a look at the three basic configurations of a transistor amplifier, as well as the fundamentals of biasing a bipolar transistor. It might seem odd to cover this topic on Hackaday when it’s certain that many of you are already familiar with it, but sometimes it’s worth remembering that not everybody is fortunate enough to be well-versed in these fundamentals. The impetus for this series came from a friend lamenting that while his pupils had advanced knowledge of microcontrollers that his generation hadn’t acquired at their age, they had not been given the opportunity to learn these basics.

There is one final piece to come on this topic: the same principles apply to other three-terminal active components, so we’ll take a quick look at FETs and tubes.

Posted in amplifier, common collector, emitter follower, follower, Hackaday Columns, how-to, transistor, transistor biasing | Leave a comment

Hacking for Learning and Laughs: The Makers of Oakwood School

The tagline of Bay Area Maker Faire is “Inspire the Future”, and there was plenty of inspiration for our future generation. There were exhibits encouraging children to get hands-on making projects to call their own, and many schools exhibiting their students’ projects and telling the stories of what they’ve done. Then there were exhibitors like the Oakwood School STEAM Council, who earned a little extra recognition for masterfully accomplishing both simultaneously.

[Marcos Arias], chair of the council, explained that each exhibit on display has two layers. Casual booth visitors see inviting hands-on activities designed to delight kids. Less obvious is that each of these experiences is the culmination of work by Oakwood 7th to 12th grade students. Some students were present to staff the activities, and they were proud to talk about the work leading up to Maker Faire with any visitors who expressed interest.

In one activity, visitors build their own tippe top. Each person pulls a 3D-printed body from inventory, performs surface finishing work with sandpaper, and installs a wooden dowel stem, with optional decoration by color markers. This simple build is accessible to a wide audience and provides immediate satisfaction with a fun toy. But how was the tippe top’s body shape determined? They did not just download something online. The profile was generated by students working and iterating through many ideas to satisfy the requirement (fit within a volume of 30 cm³) while maximizing their evaluation metric (flip over fastest and remain spinning upright longest). Once a winning design was chosen, it was printed in quantity to star in this activity at Maker Faire.

Another activity invites visitors to build a gravity racer. Just like the tippe top activity, the design actually built by Maker Faire attendees is the winning design from Oakwood students who worked to find the best shape to meet the challenge. Builders at the faire can customize their own racer during assembly from provided parts, then two racers can compete side by side on a long track to see how well their racer worked.

Chair [Marcos Arias] steers Oakwood’s STEAM program with the North Star of “Play to Passion to Purpose.” It was fascinating to hear about the work behind these and other fun Maker Faire activities. We can rest assured that creative problem-solving hacker spirit is nurtured at such schools to inspire our hackers of the future.

Posted in bamf, Bay Area maker faire, bay area maker faire 2018, cons, Hackerspaces, maker faire, Maker Faire Bay Area, STEAM education | Leave a comment

Hacking When It Counts: The Pioneer Missions

If the heady early days of space exploration taught us anything, it was how much we just didn’t know. Failure after failure mounted, often dramatic and expensive and sometimes deadly. Launch vehicles exploded, satellites failed to deploy, or some widget decided to give up the ghost at a crucial time, blinding a multi-million dollar probe and ending a mission long before any useful science was done. For the United States, with a deadline to meet for manned missions to the moon, every failure in the late 1950s and early 1960s was valuable, though, at least to the extent that it taught them what not to do next time.

For the scientists planning unmanned missions, there was another, later deadline looming that presented a rare opportunity to expand our knowledge of the outer solar system, a strange and as yet unexplored wilderness with the potential to destroy anything humans could build and send there. Before investing billions in missions to take a Grand Tour of the outer planets, they needed more information. They needed to send out some Pioneers.

A Grand Tour on the Back of Orbital Mechanics

Ever since the time of Kepler and Newton, the orbits of the planets have been well characterized, and astronomers were well aware that a planetary alignment in the late 1970s would allow a single spacecraft to visit each of the outer planets. After exploring Jupiter, the probe would use the gas giants’ gravity to fling it on to Saturn, then Uranus, then Neptune, and finally out of the Solar System. Such an alignment happens only once every 175 years, and it was too good to pass up.

But humanity hadn’t yet reached beyond the inner planets, and there were many questions. Would the asteroid belt prove an impenetrable barrier? Would Jupiter’s powerful radiation belts cook any probe that came close to it? And how would we communicate with a probe that far away? These questions and more presented unacceptable risks to a mission that would cost in the billions of dollars, and they needed to be answered before committing to a Grand Tour mission.

First proposed in 1964 as “Galactic Jupiter Probes” and approved by NASA in 1969, what would come to be known as Pioneer 10 and Pioneer 11 would answer these questions. The idea was to build small, lightweight probes that would take advantage of favorable launch windows 13 months apart in 1972 and 1973. Each probe would carry just enough instrumentation to provide the data needed to plan the Grand Tour mission that would launch in 1977 as the Voyager program.

Pioneer 11. Source: Honeysuckle Creek Tracking Station

By space program standards, both the mission and the Pioneer spacecraft were full of hacks. Pioneer 10 would visit only Jupiter to get the critical data before heading off into deep space. Pioneer 11 would continue on to Saturn, assuming either probe survived the asteroid belt, where no probe had gone before.

The spaceframe for the probes was simple and lightweight, leveraging previous probes from the Pioneer program. It consisted of a dish antenna as large as would fit on the Atlas-Centaur launch vehicle, whose Atlas first stage was an evolution of the booster flown in the manned Mercury program. To reduce weight and cost, the Pioneer spacecraft would be spin-stabilized, reducing the amount of maneuvering fuel it had to carry. NASA lucked out on power for the probes, using radioisotope thermoelectric generators (RTGs) that were given to the program by the Atomic Energy Commission. The devices provided an energy budget of only about 155 watts at launch, though, so instrumentation had to be carefully selected.

The limited energy budget meant NASA had to make some tradeoffs, though, and the biggest of these was to not include a computer onboard, at least not in the traditional sense. There was a tiny amount of memory on board, about 6 kB, which was just enough to store five commands uplinked from Earth. Everything else the spacecraft needed to do would be commanded from the ground over the Deep Space Network (DSN), taking into account the roughly 90-minute round trip for signals once the machines were at Jupiter.
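A quick light-time calculation shows where that 90-minute figure comes from. The 5.4 AU Earth-Jupiter distance used here is just a representative value, since the actual range varies as both planets move along their orbits.

```python
# Why ground commanding meant planning far ahead: a light-time check.
# The distance is illustrative; Earth-Jupiter range varies with the orbits.

LIGHT_TIME_PER_AU_S = 499.0  # seconds for light to cross one AU

def round_trip_minutes(distance_au):
    """Command up, response down: twice the one-way light time."""
    return 2.0 * distance_au * LIGHT_TIME_PER_AU_S / 60.0

# At a representative Earth-Jupiter distance of 5.4 AU, the loop is ~90 min.
print(round(round_trip_minutes(5.4)))
```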

Pix or it Didn’t Happen

How the Imaging Photopolarimeter (IPP) took pictures. Source: NASA History Program Office

In the same way that we tend to take heat from social media friends if we don’t post pictures from a vacation, NASA was not going to get away with not sending back pictures from across the Solar System. But cameras and the gear needed to digitize, record, and transmit the data are bulky and energy-hungry, so mission planners had to be especially clever about a solution. They decided to take advantage of Pioneer’s spin stabilization, using the craft’s 4.8 RPM rotation to optically scan Jupiter during the approach. The Imaging Photopolarimeter instrument was put to use as a one-pixel camera, sweeping out one scan line for every rotation of the craft, with the angle of the sensor changed slightly after each scan. Each scan was done twice, once with a red filter and once with blue, and the data was sent to the ground for extensive manipulation, including the addition of a false green channel.
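The spin-scan idea is simple enough to sketch in a few lines. Everything here is invented for illustration (the sample function, and the averaging used for the false green channel); the point is only how a single pixel plus rotation builds up a 2D image over time.

```python
# A toy version of Pioneer's spin-scan trick: one sensor sweeps one scan
# line per spacecraft rotation, and its look angle is nudged between
# rotations. The scene and the green synthesis are purely illustrative.

def spin_scan(sample, rotations, samples_per_sweep):
    """sample(row, col) -> brightness; one image row gathered per rotation."""
    image = []
    for row in range(rotations):  # the look angle steps once per spin
        image.append([sample(row, col) for col in range(samples_per_sweep)])
    return image

def add_false_green(red, blue):
    """Ground processing synthesized green; averaging red and blue is one
    simple stand-in for whatever NASA's pipeline actually did."""
    return [[(r + b) // 2 for r, b in zip(rr, bb)]
            for rr, bb in zip(red, blue)]

red = spin_scan(lambda r, c: 10 * r + c, 2, 3)   # made-up red channel
blue = spin_scan(lambda r, c: 100, 2, 3)         # made-up blue channel
print(add_false_green(red, blue))  # [[50, 50, 51], [55, 55, 56]]
```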

Up close and personal with Jupiter. Source: NASA History Program Office

Despite the low-budget nature of the design, Pioneer managed to pack a lot of science on board, from cosmic ray detectors, to magnetometers, to the crucial micrometeoroid detectors used to analyze the risks of crossing the debris-strewn asteroid belt. That last instrument was a simple affair as well, with an array of cells pressurized with gas; any impact strong enough to burst a cell could be detected by the loss of pressure.

Pioneer 10 launched on March 2, 1972; Pioneer 11 followed 13 months later. After a 20-month journey, Pioneer 10, having survived what turned out to be the not-so-dangerous asteroid belt, arrived at Jupiter. It survived the onslaught of Jovian radiation, sending back data and pictures as it gained speed in Jupiter’s gravity well. The spacecraft performed flawlessly, flinging itself around the planet and providing humanity’s first look at a crescent outer planet as it achieved solar escape velocity and became the first spacecraft bound for interstellar space.

Aware of this fate, the Pioneer team had affixed the then-controversial but now-famous Pioneer Plaque to the spacecraft, bearing images of the spacecraft, figures representing the species that built it, and data to show where and when the craft was launched, in the unlikely event that it would ever be found by another spacefaring race.

Into the Void

The Pioneer Plaque. Source: NASA

The Pioneer program was immensely successful, despite — or perhaps because of — its limited scope and budget. The Voyager probes, vastly more complicated and capable spacecraft, launched in 1977 and visited Jupiter, Saturn, Uranus, and Neptune, returning stunning images and torrents of data that scientists are still poring over, before joining the Pioneers in interstellar space.

Both Pioneers long ago went silent, their RTGs finally having degraded to uselessness. Voyager still lives on, but what it accomplished was only because Pioneer paved the way.

Many thanks to [J.R. Dahlman] for the inspiration for this article, and the suggestion to read The Depths of Space: The Story of the Pioneer Planetary Probes by Mark Wolverton.

Posted in deep space, Engineering, Featured, hacking when it counts, history, Jupiter, nasa, Original Art, Pioneer, space, voyager | Leave a comment

Animated Bluetooth Bike Turn Signals

Tired of risking his life every time he had to signal a turn using his hands while riding his bicycle in rainy Vancouver, [Simon Wong] decided he needed something a bit higher tech. But rather than buy something off the shelf, he decided to make it into his first serious Arduino project. Given the final results and the laundry list of features, we’d say he really knocked this one out of the park. If this is him getting started, we’re very keen to see where he goes from here.

So what makes these turn signals so special? Well for one, he wanted to make it so nobody would try to steal his setup. He wanted the main signal to be easily removable so he could take it inside, and the controls to be so well-integrated into the bike that they wouldn’t be obvious. In the end he managed to stuff a battery pack, Arduino Nano, and an HC-05 module inside the handlebars; with just a switch protruding from the very end to hint that everything wasn’t stock.

On the other side, an ATmega328P microcontroller along with another HC-05 drives two 8×8 LED matrices with MAX7219 controllers. Everything is powered by an 18650 lithium-ion battery with a 134N3P module to bring it up to 5 VDC. To make the device easily removable, as well as to keep the elements out, all the hardware is enclosed in a commercial waterproof case. As a final touch, [Simon] added a Qi wireless charging receiver to the mix so he could just pull the signal off and drop it on a charging pad without needing to open it up.
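The animation side of a build like this is easy to sketch: frames for an 8×8 matrix stored as row bitmasks and shifted each tick so the arrow appears to sweep. The patterns and helpers below are invented for illustration, not taken from [Simon]’s firmware.

```python
# A minimal model of an 8x8 turn-signal animation. Frames are eight row
# bitmasks; the MAX7219 would be fed these bytes one row at a time. The
# chevron pattern here is made up for illustration.

def render(frame):
    """Turn eight row bitmasks into printable lines, MSB on the left."""
    return ["".join("#" if frame[r] & (0x80 >> c) else "." for c in range(8))
            for r in range(8)]

def shift_left(frame):
    """Advance the animation by rotating every row one column left."""
    return [((row << 1) | (row >> 7)) & 0xFF for row in frame]

chevron = [0x10, 0x20, 0x40, 0x80, 0x80, 0x40, 0x20, 0x10]
print(render(shift_left(chevron))[0])  # top row after one animation step
```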

It’s been some time since we’ve seen a bike turn signal build, so it’s nice to see one done with a bit more modern hardware. But the real question: will he be donning a lighted helmet for added safety?

Posted in Arduino Hacks, arduino nano, bicycle, hc-05, led hacks, QI, transportation hacks, turn signal, weatherproof | Leave a comment

Programmable Air Makes Robotics A Breeze

[Amitabh] was frustrated by the lack of options for controlling air pressure in soft robotics. The most promising initiative, Pneuduino, seemed to be this close to a Shenzhen production run, but the creators have gone radio silent. Faced with only expensive alternatives, he decided to take one for Team Hacker and created Programmable Air, a modular system for inflatable and vacuum-based robotics.

The idea is to build the cheapest, most hacker-friendly system he can by evaluating and experimenting with all sorts of off-the-shelf pumps, sensors, and valves. From the looks of it, he’s pretty much got it dialed in. Programmable Air is based around $9 medical-grade booster pumps that are as good at pulling a vacuum as they are at providing pressurization. The main board has two pumps, and it looks like one is set to vacuum and the other to spew air. There’s an Arduino Nano to drive them, and a momentary switch to control the air flow.

Programmable Air can support up to 12 valves through daughter boards that connect via right-angle header. In the future, [Amitabh] may swap these out for magnetic connections or something else that can withstand repeated use.
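The control model described above (one pressure pump, one vacuum pump, and up to a dozen valves routing air to soft actuators) can be sketched roughly as follows. The class and method names are placeholders for illustration, not the project’s actual API.

```python
# A rough model of the Programmable Air control scheme: two pumps and up
# to twelve valves. Names and structure are invented for illustration.

class ProgrammableAir:
    def __init__(self, n_valves=12):
        self.valves = [False] * n_valves   # False = valve closed
        self.pressure_on = False
        self.vacuum_on = False

    def inflate(self, valve):
        """Route the pressure pump to one actuator."""
        self.vacuum_on = False
        self.pressure_on = True
        self.valves[valve] = True

    def deflate(self, valve):
        """Route the vacuum pump to one actuator."""
        self.pressure_on = False
        self.vacuum_on = True
        self.valves[valve] = True

    def idle(self):
        """Pumps off, all valves closed."""
        self.pressure_on = self.vacuum_on = False
        self.valves = [False] * len(self.valves)

air = ProgrammableAir()
air.inflate(3)
print(air.pressure_on, air.valves[3])  # True True
```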

Blow past the break to watch Programmable Air do pick and place, control a soft gripper, and inflate a balloon. The balloon’s pressurization behavior has made [Amitabh] reconsider adding a flow meter, but so far he hasn’t found one at a reasonable cost per unit. Can you recommend a small flow meter that won’t break the bank? Let us know in the comments.

Posted in 2018 Hackaday Prize, air pressure, air pump, Arduino Hacks, arduino nano, pick and place, pressure sensor, soft robotics, The Hackaday Prize, vacuum | Leave a comment

VCF East: The Desktop ENIAC

The ENIAC, or Electronic Numerical Integrator and Computer, is essentially the Great Great Grandfather of whatever device you’re currently reading these words on. Developed during World War II for what would be about $7 million USD today, it was designed to calculate artillery firing tables. Once word got out about its capabilities, it was also put to work on such heady tasks as assisting with John von Neumann’s research into the hydrogen bomb. The success of ENIAC led directly into the development of EDVAC, which adopted some of the now standard computing concepts such as binary arithmetic and the idea of stored programs. The rest, as they say, is history.

But ENIAC wasn’t just hugely expensive and successful, it was also just plain huge. While it’s somewhat difficult for the modern mind to comprehend, ENIAC was approximately 100 feet long and weighed in at a whopping 27 tons. In its final configuration in 1956, it contained about 18,000 vacuum tubes, 7,000 diodes, 70,000 resistors, 10,000 capacitors, and 6,000 switches. All that hardware comes with a mighty thirst for power: the ENIAC could easily suck down 150 kW of electricity. At the time this all seemed perfectly reasonable for a machine that could perform 5,000 instructions per second, but today an Arduino would run circles around it.

This vast discrepancy between the power and size of modern hardware versus such primordial computers was on full display at the Vintage Computer Festival East, where [Brian Stuart] demonstrated his very impressive ENIAC emulator. Like any good vintage hardware emulator, his project not only accurately recreates the capabilities of the original hardware, but attempts to give the modern operator a taste of the unique experience of operating a machine that had its heyday when “computers” were still people with slide rules.

How Low Can You Go?

Given the monstrous rift between the computational power of the ENIAC and something as pedestrian as the Raspberry Pi, it’s natural to wonder just how much abstraction is required to emulate the hardware. We occasionally talk about “cycle accurate” emulation when dealing with older hardware, which essentially means the emulator reproduces the original machine’s timing closely enough that unmodified software runs exactly as it did on the real hardware. The emulator is accurate insofar as the software running on it cares to look, but that doesn’t mean the underlying methods are the same. This lets an emulator run the older software while using modern tricks to improve overall performance.

But with a computer as slow as the ENIAC, speed isn’t really a concern. We’ve got plenty of power to burn, so how accurate can you get? Originally, [Brian] thought it would be interesting to simulate ENIAC on the circuitry level. But given that part count, and the fact that the documentation really only has a rough explanation of the internal circuitry, he thought that might be a tall order. In the end, he decided to simulate the ENIAC down to the actual pulses that would travel through the machine while in operation. This level of emulation makes his software exceptionally accurate, and indeed it can run any example program from the original ENIAC technical manuals, but it does mean that even on modern computers the simulation can run more slowly than the actual ENIAC did. But the increased fidelity, especially for those who wish to truly understand how early computers like this operated, is worth waiting around for.
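To give a flavor of what pulse-level simulation means: ENIAC stored each decimal digit in a ten-stage ring counter that advanced one stage per pulse, emitting a carry pulse on wraparound. The toy model below is illustrative only, not a piece of [Brian]’s emulator.

```python
# A toy pulse-level model of one ENIAC decade ring counter. Real ENIAC
# accumulators had ten of these plus sign circuitry; this is a sketch.

class DecadeCounter:
    def __init__(self):
        self.digit = 0

    def pulse(self):
        """One pulse advances the ring; rolling past 9 emits a carry."""
        self.digit = (self.digit + 1) % 10
        return self.digit == 0  # True means a carry pulse fired

def add_pulses(counter, n):
    """Feed n pulses in, count the carries that pop out."""
    carries = sum(1 for _ in range(n) if counter.pulse())
    return counter.digit, carries

c = DecadeCounter()
print(add_pulses(c, 23))  # (3, 2): 23 pulses leave digit 3, with two carries
```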

The ENIAC Experience

It would have been easier to create a command line emulator for the ENIAC that just dumped its results to the terminal (and indeed others have done just that), but that wouldn’t give you the feeling of actually running a computer that was large enough to take up a building. For that, [Brian] created a number of visuals that use actual images of the ENIAC panels. This gives the user the impression of actually standing in front of the computer, watching the banks merrily blink away as it works through the given program.

While it’s not required to use the emulator, [Brian] even went as far as recreating the handheld control unit the ENIAC operators would have used. He mentions this peripheral is often overlooked; in his research he was only able to find a single clear photo of the device on which to base his 3D-printed model.

ENIAC: The Home Game

[Brian] has made the source code and compiled binaries for his ENIAC simulator available for anyone who wishes to try their hand at commanding a virtual representation of the original “Big Iron”. He’s even provided binaries for machines as lowly as the C.H.I.P (if you can find one, that is) so it doesn’t take much gear to get your own mini ENIAC up and running. You’ll have to provide your own hydrogen bomb to research, though.

If you’d like a crash course on the rather alien methods of programming the beast, our very own [Al Williams] recently wrote up a fantastic piece about the ENIAC, including some information on operating it within a virtual environment.

Posted in classic hacks, computing history, cons, emulation, eniac, Hackaday Columns, retrocomputing, VCF, VCF East, VCF East XIII | Leave a comment

Hackaday Belgrade 2018 is Sold Out: We Can’t Wait for Saturday

Greetings from beautiful Belgrade! With the Hackaday crew arriving over the last couple of days, preparations are in full swing, and the excitement is building for Hackaday Belgrade 2018 on Saturday. Here’s all the news you need to know.

If you haven’t gotten tickets yet, you can’t say we didn’t warn you! We’ve sold out. But don’t despair: there’s a waitlist, so get your name in now if you still want to get in.

If you’re looking for something to do in town this weekend, don’t miss [Brian Benchoff]’s Ode to Belgrade and especially some great local info in the comments. From which taxis to take, to finding a hardware store, to touring monuments of brutalist architecture, this post has it all.

And last but not least, the badges are in the final stages of production. [Voja] and [Mike] are temporarily distracted by watching themselves on N1, the Serbian CNN affiliate, for which they were interviewed this morning about hacker culture, and about building badge hardware and writing the firmware for it. They’ll get back to epoxying speakers and writing code any time now.

In short, Hackaday Belgrade is a sold-out, unstoppable force of nature. We’re so excited to be here and can’t wait to see you all on Saturday!

Posted in Belgrade, cons, Hackaday Belgrade, Hackaday Belgrade 2018, Hackaday Columns | Leave a comment

A Smarter PSU Converter Leaves the Magic Smoke Inside

Over the years, computers have become faster and, at the same time, more power hungry. Way back around the 386 era, most PCs used the AT standard for power supplies. Since then, the world has moved on to the now-ubiquitous ATX standard. Hobbyists working on older machines will typically use these readily available supplies with basic adapters to run old machines, but [Samuel] built a better one.

Most AT to ATX adapters are basic passive units, routing the various power lines where they need to go and tying the right pin high to switch the ATX supply on. However, using these with older machines can be fraught with danger. Modern supplies are designed to deliver huge currents, over 20 A in some cases, to run modern hardware. Conversely, a motherboard from the early 90s might only need 2 or 3A. In the case of a short circuit, caused by damage or a failed component, the modern supply will deliver huge current, often damaging the board, due to the overcurrent limit being set so high.

[Samuel]’s solution is to lean on modern electronics to build an ATX to AT adapter with programmable current protection. This allows the current limit to be set far lower in order to protect delicate boards. The board can be set up in either a “fast blow” or a “slow blow” mode to suit various working conditions, and [Samuel] reports that with alternative cabling, it can also be used to power up other old hardware such as Macintosh or Amiga boards. The board is even packed with extra useful features like circuitry to generate the sometimes-needed -5V rail. It’s all programmed through DIP switches and even has an OLED display for feedback.
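The two trip modes can be modeled simply: a fast-blow limit trips the instant current exceeds a threshold, while a slow-blow limit tolerates brief overloads by integrating overcurrent over time (an I²t-style budget). The thresholds below are examples only; [Samuel]’s board sets its own limits via DIP switches.

```python
# A sketch of fast-blow vs. slow-blow overcurrent logic. Limits and the
# I^2*t budget are illustrative, not values from [Samuel]'s firmware.

def fast_blow(current_a, limit_a=3.0):
    """Trip the moment current exceeds the limit."""
    return current_a > limit_a

def slow_blow(samples, limit_a=3.0, budget=5.0):
    """Trip once accumulated (I - limit)^2 * dt exceeds a budget,
    so short overloads pass but sustained ones cut power.
    samples: list of (current_amps, duration_seconds)."""
    energy = 0.0
    for current_a, dt_s in samples:
        if current_a > limit_a:
            energy += (current_a - limit_a) ** 2 * dt_s
            if energy > budget:
                return True
    return False

print(fast_blow(3.5))                       # True: instant trip
print(slow_blow([(4.0, 1.0), (4.0, 1.0)]))  # False: brief mild overload is fine
print(slow_blow([(6.0, 1.0)] * 2))          # True: heavy overload trips
```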

It’s an adapter that could save some rare old hardware that’s simply irreplaceable, and for that reason alone, we think it’s a highly important build. We’ve talked about appropriate fusing and current limiting before, too – namely, with LED strips. 

Posted in AT, atx, classic hacks, pc, pc at, power supply, psu, retrocomputing, vintage hardware | Leave a comment