Once Upon a Pipe in the West: A working adapter from HDMI to Gardena

*This text was first published in German on [Golem](https://www.golem.de) on 16 August 2025: [HDMI auf Gardena: Spül mir das Lied vom Tod](https://www.golem.de/news/hdmi-auf-gardena-spuel-mir-das-lied-vom-tod-2508-198875.html).*

---

**How to transmit a Western film through a garden hose, and what a Soviet lunar probe has to do with it**

Anyone who thinks that the HDMI format is the end of all connection problems has obviously never tried to send a film through a garden hose. This experiment combines DIY humour, space history and a dash of absurd engineering: an adapter from HDMI to Gardena – technically feasible, practically questionable, but guaranteed to be entertaining.

## We love adapters

Wanting to convert signals from one format to another is in the nature of the digital human. It is a mechanism for adapting to an industry that has been bombarding its customers with new plugs for decades, regularly adding new levels to [the popular children's game with different shapes and matching holes](https://www.youtube.com/watch?v=cUbIkNUFs-4).

Many people probably still have the accessories for this in their drawers. If you rummage around a bit, you will uncover treasures from days gone by: disused mobile phones, power supplies, CD-ROMs, batteries in unknown states of charge and health, USB sticks with advertising imprints and, finally, all kinds of adapters and connectors. From USB-A to micro-USB, RCA to mini jack – and in deeper sediments, you might even find a 20-pin Scart monster with an S-video connector at the other end of the cable.

Pre-USB veterans still remember the backs of earlier computers, which had different sockets for the keyboard (DIN), mouse (9-pin serial interface), printer (25-pin parallel interface), external storage media (various SCSI formats), monitor (VGA, 15-pin D-Sub), joystick (also 15-pin D-Sub, but different) and so on.

## Looking through? Too easy!
No wonder, then, that adapters have become part of contemporary culture and thus a popular object for jokes of all kinds. A classic example is the [adapter from HDMI to the Gardena coupler system for garden hoses](https://traumshop.net/produkt/hdmi-zu-gardena/). For the opposite direction, i.e. from Gardena to HDMI, there are even [functional products](https://www.youtube.com/shorts/5OVJApHEONU) available.

Let's go one step further and transmit images through a garden hose. The easy way would be to pull the hose straight, hold a screen to one end and look through from the other. But we want something more elegant.

We find the technical solution to the problem in history – more precisely, in Soviet space history. Because there was a mission there that faced very similar transmission problems, and its solution still works today, even when the space radio link is replaced with gardening tools.

## The dark side of the moon

The year is 1959, and we are in the midst of the Cold War. [The space race](https://en.wikipedia.org/wiki/Space_Race) between the USA and the USSR is in full swing. In the competition to be the first to reach the moon, the Soviet Union initially takes the lead: Luna 1 is the first space probe to fly close to the moon (it was supposed to crash into it, but missed), Luna 2 at least hits the lunar surface, and Luna 3 has a particularly ambitious goal: using two cameras, it is to deliver the first images of the far side of the moon – the area never visible from Earth.

Then as now, sending an object into space is much easier than landing it intact back on Earth. At the time of the Luna 3 mission in 1959, no spacecraft had ever returned in one piece, and for Luna 3, too, a return from the moon was completely out of the question – the first soft landing would not be achieved until the following year, when a Vostok capsule returned intact from Earth orbit.
## The world's biggest radio dead spot

The only way to obtain the images of the far side of the moon taken by Luna 3 is therefore to transmit them by radio. Radio technology, developed in the 1890s, is not the problem here; the technology for live image transmission also already exists and is in large-scale use for television. The problem is the 73 trillion tonnes of moon standing between the probe and the receiving stations at the time of recording: there is no place on Earth from which the probe could be received.

The only solution is to take the photos and send them later, after the probe has emerged from the moon's radio shadow. This presents the next technical challenge: digital photography – and with it, simple methods for capturing, storing and encoding image data – has not yet been invented. The first CCD sensor will not see the light of day until 1969. So light-sensitive film still has to be used and chemically developed.

## Film material from the class enemy

For this reason, a compact laboratory that can develop, fix, rinse and dry the film fully automatically is constructed and packed on board. A spicy detail: the film material comes from the class enemy. It is taken from intercepted US spy balloons of the [Genetrix project](https://en.wikipedia.org/wiki/Project_Genetrix), hundreds of which drifted over the Eastern Bloc in the stratosphere. The film used in their cameras had been specially developed by the CIA to withstand temperature extremes and radiation. The unexposed film surplus from the captured reconnaissance balloons is now sent into space as part of the Soviet space programme. But how does the developed picture reach Earth?

## A fax via radio

To solve this, Soviet engineers use a radio technique known today as slow-scan television (SSTV) – in contrast to ‘fast-scan television’, i.e. normal television. SSTV allows image data to be transmitted via narrowband connections.
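The core trick, shared by all SSTV modes, is to map each pixel's brightness onto an audio frequency – conventionally 1500 Hz for black up to 2300 Hz for white. A minimal sketch of this principle (illustration only: real modes add sync pulses, a VIS header and mode-specific line timings, all omitted here):

```python
import numpy as np

SAMPLE_RATE = 44100  # audio samples per second

def encode_scanline(pixels, ms_per_pixel=0.5,
                    f_black=1500.0, f_white=2300.0):
    """Turn one scan line of 0..255 brightness values into an FM tone.

    Brightness is mapped linearly onto the 1500-2300 Hz SSTV luminance
    band; the phase is accumulated so the tone stays continuous across
    pixel boundaries.
    """
    samples_per_px = int(SAMPLE_RATE * ms_per_pixel / 1000)
    freqs = f_black + (np.asarray(pixels, dtype=float) / 255.0) * (f_white - f_black)
    inst_freq = np.repeat(freqs, samples_per_px)   # per-sample frequency
    phase = 2 * np.pi * np.cumsum(inst_freq) / SAMPLE_RATE
    return np.sin(phase)

# A line fading from black to white becomes a rising whistle:
line = encode_scanline(np.linspace(0, 255, 160))
```

Feeding this to an SSTV decoder would not yet produce an image, since the sync information is missing, but it shows why SSTV fits into an ordinary audio channel: the whole signal lives between 1500 and 2300 Hz.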
Similar to a fax machine, the image is scanned line by line and the measured brightness value is modulated onto the radio signal. In Luna 3, the developed film is stretched in front of the screen of a cathode ray tube, on which a moving point of light scans the image surface line by line. A photomultiplier mounted in front of the film being scanned measures the brightness and converts it into an electrical signal.

On Earth, these radio signals are received at two stations in Crimea and Kamchatka. There, the signal is displayed on a picture tube equipped with a particularly long-persistence phosphor screen: when the last line has been transmitted, the previous lines – and thus the entire image – are still visible. In a darkened room, the transmitted image can then be photographed from this screen.

The idea of image transmission via SSTV goes back to the Russian television pioneer Semyon Isidorovich Kataev, who developed a system for narrowband transmission of television images in 1934. Starting in 1957, the technology was adapted for amateur radio use by Copthorne ‘Cop’ Macdonald, then a student at the University of Kentucky, and was approved for this purpose by the Federal Communications Commission (FCC) in 1968.

SSTV transmissions via amateur radio use a frequency range that is also used for voice transmission. This means that image data encoded for SSTV can also be output as an audible signal. And this is precisely the trick we now use for our HDMI-Gardena adapter.

## Whistling in the hose

The signal source in our setup is the HDMI output of a laptop. The film playing is ‘Once Upon a Time in the West’ – it could have been any other film, but I needed the title as a headline for this article. (Remark: the original German version of this text uses the title "Spül mir das Lied vom Tod", a play on the German title of the movie, "Spiel mir das Lied vom Tod".
"Spülen" means "to rinse", which seemed fitting for a garden hose.)

The first component of our adapter is an HDMI grabber connected to the HDMI cable. In this case, it is a [no-name product purchased from Berrybase](https://www.berrybase.de/usb-hdmi-video-grabber-1080p), because Berrybase was one of the few suppliers that guaranteed Linux compatibility for its products. Towards the signal source, i.e. the laptop, this grabber acts as an external monitor. The other end of the grabber is plugged into the USB port of a netbook. The netbook's operating system, [MX-Linux](https://mxlinux.org/), recognises the grabber as a webcam, so no additional drivers are required.

## Garden hoses lack bandwidth

So far, we have only transferred the video stream from the source to the first part of our adapter. At the source, this stream still has full-HD resolution at 30 frames per second, in colour. Since garden hoses are not designed for video transmission and offer far less bandwidth, we have to reduce the quality for the next steps.

There are various [quality levels for image transmission via SSTV](https://cartoonman.github.io/WAVECOM/wavecomhtm/sstv.htm). The highest resolution offers 512 x 256 pixels and RGB colour, but transmitting a single image can then take up to three minutes. Measured against the duration of a space mission, that may be an acceptable frame rate; for a film, however, it would detract significantly from the suspense – even in a Western by Sergio Leone, who was renowned for his long takes.

If we want to maximise the frame rate instead of the image quality, this comes at the expense of resolution and colour: in the fastest mode, *Robot 8*, we transmit a greyscale image with a resolution of 160 x 120 pixels in 8 seconds – or, in other words, at 0.125 fps.

For encoding, we use the [libsstv](https://github.com/rimio/libsstv) library, which can be downloaded from GitHub and compiled yourself.
Although libsstv is actually a program library for encoding SSTV, it can also run stand-alone and be called as a command from the console. An alternative would have been the [PySSTV](https://pypi.org/project/PySSTV/) package. However, it requires an image-processing library that is no longer available in a 32-bit version, and since the netbook with its Atom processor can only run 32-bit software, this option was not viable.

[A small script](https://video.golem.de/download/27755) running in a continuous loop does the following:

1. The program *fswebcam* captures the current webcam image – i.e. a still from the video stream – converts it to greyscale at 160 x 120 pixels and saves it to a file.
2. *libsstv* generates an audio file from the image file, with the image encoded in it.
3. *aplay* plays the audio file generated in this way.

The feature film, encoded as a whistle signal, can now be fed into the garden hose.

## Around the world, around the wo-orld

A [talkbox](https://www.thomann.co.uk/harley_benton_talk_box.htm) is used for this task. It is actually an effects device often used by guitarists or keyboardists: it directs the sound of the instrument through a tube into the musician's mouth, who can then shape the sound spectrum with mouth movements. In principle, using a talkbox is like speaking or singing, except that the sound is not produced by breathing and the vocal cords but is [introduced through the tube](https://www.youtube.com/watch?v=h_L5v9OTSxc). The result, depending on how it is played, is a robotic-sounding vocal.

The talkbox was popularised in the 1970s by guitarist Peter Frampton, for example in [Show Me the Way](https://www.youtube.com/watch?v=o6xGqi5itxs). Other well-known examples are [Around the World by Daft Punk](https://www.youtube.com/watch?v=dwDns8x3Jb4) and [California Love by 2Pac](https://www.youtube.com/watch?v=J7_bMdYfSws).
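The three-step capture, encode and play loop described in the previous section can be sketched as follows. This is a reconstruction, not the original script (which is a shell script): the fswebcam flags and, above all, the name of the stand-alone libsstv command – `sstv_encode` below – are assumptions that must be adapted to your own build.

```python
import subprocess

def frame_commands(image="frame.png", audio="frame.wav"):
    """Return the three commands for one frame, in pipeline order.

    The fswebcam flags and the `sstv_encode` command name are
    placeholders; check the fswebcam man page and your libsstv build.
    """
    return [
        ["fswebcam", "--no-banner", "--greyscale",
         "-r", "160x120", image],          # 1. grab and downscale a frame
        ["sstv_encode", image, audio],     # 2. encode the image as SSTV audio
        ["aplay", audio],                  # 3. play the audio into the talkbox
    ]

def run_forever():
    """Continuous loop: roughly one frame every 8 s in Robot 8 mode."""
    while True:
        for cmd in frame_commands():
            subprocess.run(cmd, check=True)
```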
The talkbox is connected to an audio source (in our case, the audio output of the netbook) like a normal active speaker. The amplifier is usually integrated; the other parts are a horn driver and the aforementioned tube. A horn driver is a loudspeaker that does not radiate freely into the environment: its front is sealed airtight except for a small opening. Together with a relatively large diaphragm, a horn driver can thus build up [high alternating pressure at the opening](https://www.youtube.com/watch?v=pFEB0chiuJA).

## The horn driver pumps the signal into the tube

When used in a horn loudspeaker, an acoustic horn is placed in front of this opening to amplify the signal. This design is more efficient than letting the loudspeaker radiate directly into the environment. Such horns are familiar from PA systems at festivals and in discotheques, or from loudspeakers for railway station announcements.

In a talkbox, a tube is attached instead of a horn, and the horn driver pumps the signal into this tube. The musician puts the other end of the tube in their mouth and can modulate the sound by moving tongue, lips and jaw. The sound of the guitar or keyboard, with modulated speech, then comes out of the musician's mouth and can be picked up by a vocal microphone. And instead of an electric guitar, we can of course also pump other signals into the tube – such as a film.

## Charles Bronson as a comic book hero

Finally, a Gardena hose coupling is attached to the other end of the hose to complete the HDMI-Gardena adapter, and a standard water sprayer is attached to it. This is where the acoustic signal is emitted, clearly audible as a loud chirping. This signal can now be picked up by a microphone and decoded.
There is a simple solution for this: the Robot36 app (available for Android from [F-Droid](https://f-droid.org/de/packages/xdsopl.robot36/) and the [Google Play Store](https://play.google.com/store/apps/details?id=xdsopl.robot36)) does exactly that on a smartphone or tablet. It uses the built-in microphone as the signal source, decodes the incoming signal in real time and displays the resulting image. Depending on the incoming signal, a new image becomes visible every eight seconds.

Well, maybe we promised a little too much. Instead of a film, there are only still images – and those only in low quality. There is also no sound, but this could easily be added with a second hose and a second talkbox.

## A slideshow with 0.125 fps

The biggest weakness of the setup is the low frame rate. A slideshow with one image every eight seconds – or even less frequently at higher quality – takes the excitement out of even the best Western. One possible solution would be to break the film down into individual images characteristic of the respective scenes. With automatically generated subtitles, this would result in a kind of comic strip – still not a film, but at least the plot would be comprehensible.

An alternative would be to record the film on the computer in the adapter, break it down into individual frames, transfer them one by one through the hose and reassemble them on a target system. This would not happen in real time, but the playback would be smooth, and better image quality would also be possible.

Let's calculate this with an example: a two-hour film at 25 frames per second consists of 180,000 individual frames. In the SSTV mode *Martin M1*, a colour image with 320 x 256 pixels can be transmitted in 114 seconds. Transmitting the entire film therefore takes about 20.5 million seconds, or 237.5 days – plus two hours for the soundtrack, or four if we want stereo.
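The arithmetic above is easy to verify; the frame count, mode timing and results below are exactly the figures from the text:

```python
# Robot 8: one 160 x 120 greyscale image per 8 s
robot8_fps = 1 / 8                               # 0.125 fps

# Martin M1 transfer of a whole film, frame by frame
fps = 25
film_frames = 2 * 60 * 60 * fps                  # 2 h at 25 fps = 180,000 frames
seconds_per_image = 114                          # Martin M1: 320 x 256, colour
total_seconds = film_frames * seconds_per_image  # 20,520,000 s
total_days = total_seconds / 86_400              # 237.5 days

print(robot8_fps, film_frames, total_seconds, total_days)
# → 0.125 180000 20520000 237.5
```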
With these limitations, film transmission via garden hose is unlikely to catch on. But at least there is now proof that an adapter from HDMI to Gardena is technically feasible, quite simple in design and easy to replicate.