Streaming Deck Removes Need For Dedicated Hardware

Streaming content online has never been more popular than it is now. From YouTube to Twitch, there are all kinds of creators with interesting streams across a wide spectrum of interests. With that gold rush come plenty of people selling figurative shovels as well: audio mixing gear, high-quality webcams, and dedicated devices for controlling all of this technology. These controllers often take the form of a tablet-like device, but [Lenochxd] thinks that any tablet ought to be able to perform this task without needing dedicated, often proprietary, hardware.

The solution offered here is called WebDeck, an application written in Flask that turns essentially any device with a browser into a stream control surface. It helps to have a touch screen, of course, but the abundance of tablets and smartphones in the world makes that a non-issue. With the software running on the host computer, the streamer scans a QR code that opens a browser window with all of the controls, from which various aspects of that computer can be driven remotely. It also has support for VLC, OBS Studio, and Spotify, which covers the bases for plenty of streaming needs.
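WebDeck’s source is available for the curious, but the basic recipe is easy to picture. The sketch below is not WebDeck’s code, just a minimal stand-in for the idea, with a hypothetical table of button names mapped to local commands: a Flask server on the host serves a page of oversized buttons to any browser on the network and prints a QR code so a phone or tablet can find it.

```python
# Not WebDeck itself -- a minimal sketch of the same idea. The ACTIONS table
# and the commands in it are placeholders, not the real project's feature set.
import socket
import subprocess

import qrcode                     # pip install qrcode
from flask import Flask, abort

app = Flask(__name__)

ACTIONS = {
    "notepad": ["notepad.exe"],                    # harmless Windows example
    "mute": ["nircmd.exe", "mutesysvolume", "2"],  # if you have NirCmd installed
}

@app.route("/")
def deck():
    # One big button per action works fine on any phone or tablet browser.
    buttons = "".join(
        f'<a href="/run/{name}"><button style="font-size:3em">{name}</button></a>'
        for name in ACTIONS
    )
    return f"<html><body>{buttons}</body></html>"

@app.route("/run/<name>")
def run(name):
    if name not in ACTIONS:
        abort(404)
    subprocess.Popen(ACTIONS[name])  # fire and forget on the host machine
    return deck()

if __name__ == "__main__":
    # Print a QR code in the terminal that points the tablet at this server.
    ip = socket.gethostbyname(socket.gethostname())
    qr = qrcode.QRCode()
    qr.add_data(f"http://{ip}:5000/")
    qr.print_ascii()
    app.run(host="0.0.0.0", port=5000)
```

The real project layers its VLC, OBS Studio, and Spotify control on top of the same bones, but the browser-plus-QR-code trick is the heart of it.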

Currently the host software only runs on Windows, but [Lenochxd] hopes to have macOS and Linux versions available soon. We’re always in favor of any device that uses existing technology and avoids proprietary hardware and software. Hopefully that’s a recipe to avoid planned obsolescence and unnecessary production. If you prefer a version with a little tactile feedback, though, we’ve seen other decks which add physical buttons for quick control of the stream.

Unlimited Cloud Storage YouTube Style

[Adam Conway] wanted to store files in the cloud. However, if you haven’t noticed, unlimited free storage is hard to find. We aren’t sure if he wants to use the tool he built seriously, but he decided that if he could encode data in a video format, he could store his files on YouTube. Does it work? It does, and you can find the code on GitHub.

Of course, the efficiency isn’t very good. A 7 kB image, for example, yielded a 9-megabyte video. If we were going to store files on YouTube, we’d encrypt them, too, making it even worse.

The first attempt was to break the file into pieces and encode them as QR codes. That makes sense, but it didn’t work out. To get enough data into each frame, the modules (think pixels) in the QR code had to be small, and combined with video compression, the system was unreliable.

Simplicity rules. Each frame is 1920×1080 and uses a black pixel as a one and a white pixel as a zero. In theory, this gives about 259 kbytes per frame. However, to help avoid decoding problems caused by video compression, each real bit uses a 5×5 pixel block, which works out to about 10 kbytes of data per frame.
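If you want to play with the idea without digging through the real code, the arithmetic is easy to reproduce. The snippet below is our own rough sketch, not [Adam Conway]’s implementation: it packs bytes into a 1080p frame of 5×5 black-and-white blocks with numpy and reads them back, and the 10 kbyte figure falls straight out of the block size.

```python
# Rough sketch of the one-bit-per-5x5-block scheme -- not the project's code.
import numpy as np

WIDTH, HEIGHT, BLOCK = 1920, 1080, 5
COLS, ROWS = WIDTH // BLOCK, HEIGHT // BLOCK   # 384 x 216 blocks
CAPACITY = (COLS * ROWS) // 8                  # 10,368 bytes per frame

def encode_frame(data: bytes) -> np.ndarray:
    """Pack up to CAPACITY bytes into one frame: 1 = black, 0 = white."""
    assert len(data) <= CAPACITY
    padded = data + b"\x00" * (CAPACITY - len(data))       # zero padding
    bits = np.unpackbits(np.frombuffer(padded, dtype=np.uint8))
    grid = bits.reshape(ROWS, COLS)
    # Blow each bit up to a 5x5 block so compression can't smear it away.
    frame = np.kron(grid, np.ones((BLOCK, BLOCK), dtype=np.uint8))
    return np.where(frame == 1, 0, 255).astype(np.uint8)

def decode_frame(frame: np.ndarray) -> bytes:
    """Average each 5x5 block and threshold it back to a bit."""
    blocks = frame.reshape(ROWS, BLOCK, COLS, BLOCK).mean(axis=(1, 3))
    bits = (blocks < 128).astype(np.uint8)                 # dark block -> 1
    return np.packbits(bits.flatten()).tobytes()

payload = b"hello, youtube" * 100
assert decode_frame(encode_frame(payload)).startswith(payload)
```

Note the zero padding in the encoder: a scheme like this is exactly why a recovered file can come back with extra bytes tacked onto the end, as mentioned below.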

The code isn’t perfect. It can tack extra data onto the end of a recovered file, for example, but that would be easy to fix. The protocol could use error correction and compression. You might even build encryption into it or store more data, old-school cassette-style, using the audio channel. Still, as a proof of concept, it is pretty neat.

This might sound like a new idea, but people way back in the early home computer days could back up data to VCRs. This isn’t even the first time we’ve seen it done with YouTube.

Digital Master Tapes Seek Deck

As a nerdy kid in the 90s, I spent a fair bit of time watching the computer-themed cartoon ReBoot. While making a documentary about the show, [Jacob Weldon] and [Raquel Lin] have uncovered its original digital master tapes.

This is certainly exciting news for fans of the show, but there’s a bit of a wrinkle. These digital masters are all on D-1 digital cassette tapes, and the studio no longer has a player for them. The dynamic duo are on the hunt for a Bosch BTS-D1 so they can recapture some of this video for their own film, while also heavily hinting to the studio that a new box set made from the masters would be well-received.

As the first CGI TV series, ReBoot has a special place in the evolution of entertainment. Not only was it a technical marvel for its time, it was also solid enough to last four seasons and win numerous awards before meeting a cliffhanger ending. If you’re an expert in D-1 or have a deck to lend or sell, be sure to email the creators.

Feeling nostalgic for the electromechanical era? Why not check out some hidden lyrics on Digital Compact Cassettes (DCC) or encoding video to Digital Audio Tapes (DAT)?

[via Notebookcheck]

Video Feedback Machine Creates Analog Fractals

One of the first things everyone does when they get a video camera is point it at the screen displaying its own image, creating video feedback. It’s a fascinating process in which the delay from image capture to display establishes a feedback loop that amplifies image noise into fractal patterns. This sculpture, modestly called The God Machine II, takes it to the next level, though.

We covered the first version of this machine in a previous post, but creator [Dave Blair] has since done a huge amount of work on the device that allows him to tweak and customize its output. The new version is quite remarkable, letting him create intricate fractals that writhe and change like living things.

The God Machine II is a sophisticated build with three cameras, five HD monitors, three Roland video switchers, two viewing monitors, two sheets of beam splitter glass, and a video input. This setup means it can take an external video input, capture it, and use it as the source for video feedback, then tweak the evolution of the resulting fractal image, repeatedly feeding it back into itself. The system can also control the settings for the monitor, which further changes the feedback as it evolves. [Blair] refers to this as “trapping the images.”


Three resin-printed Single 8 film cartridges.

Re-Inventing The Single 8 Home Movie Format

[Jenny List] has been reverse-engineering and redesigning the Single 8 home movie film cartridge for the modern age, to breathe life into abandoned cine cameras.

One of the frustrating things about working with technologies that have been with us for a while is the proliferation of standards and the way that once-popular formats can become obsolete over time, leaving equipment effectively unusable and unloved.

There is perhaps no greater example of this than in film photography – an industry and hobby that has been with us for over 100 years and that has left many cameras orphaned once the film format they relied on was no longer available (Disc film, anyone?).

Thankfully, Hackaday’s own [Jenny List] has been working hard to bring one particular cine film format back from the dead and has just released the fourth instalment in a video series documenting the process of resurrecting the Single 8 format cartridge.

Bringing Back The CRT TV Experience In Software

Cathode-Retro is a collection of shaders and sample C++ code for reliving the glorious days when graphics were composite video signals displayed on a CRT screen. How? By faking it in software and providing more configuration options than any authentic setup ever had.

Love it or don’t, there’s nothing quite like it.

Not satisfied with creating CRT-style color images with optional scanlines and TV picture controls like tint and saturation, Cathode-Retro can emulate more nuanced elements as well.

The tool can imitate things like the gentle distortion of a period-correct curved screen, the subtle visual differences between the ways CRT displays actually worked (shadow mask versus aperture grille, for instance), and even the slight imperfection of light refracting through the glass face of the tube. There are even options for adding noise and ghosting, which may spark some artistic ideas.
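Cathode-Retro itself is a set of shaders with C++ sample code, but a couple of the simpler effects are easy to approximate offline if you just want a feel for the math. The numpy sketch below is ours, not the project’s: it fakes a mild curved-screen warp and darkened scanlines on an RGB frame, which is roughly the shape of what a per-pixel shader does, minus all the composite-signal modeling.

```python
# Crude CPU-side approximation of two CRT effects -- not Cathode-Retro code.
import numpy as np

def crt_look(img: np.ndarray, curvature: float = 0.08,
             scanline: float = 0.35) -> np.ndarray:
    """img: HxWx3 uint8 array. Returns a warped copy with darkened scanlines."""
    h, w, _ = img.shape
    ys, xs = np.mgrid[0:h, 0:w]

    # Normalized coordinates in [-1, 1], centered on the screen.
    nx = xs / (w - 1) * 2.0 - 1.0
    ny = ys / (h - 1) * 2.0 - 1.0

    # Curved-screen warp: sample further from center as the radius grows.
    r2 = nx * nx + ny * ny
    sx = np.clip((nx * (1 + curvature * r2) + 1) / 2 * (w - 1), 0, w - 1)
    sy = np.clip((ny * (1 + curvature * r2) + 1) / 2 * (h - 1), 0, h - 1)
    out = img[sy.astype(int), sx.astype(int)].astype(np.float32)

    # Scanlines: darken every other row a little.
    out[1::2, :, :] *= 1.0 - scanline
    return out.astype(np.uint8)
```

The real library does this with far more care (and in real time), but the structure is the same: remap where each output pixel samples from, then modulate its brightness.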

If all you need is software to recreate an old-school CRT terminal, we have you covered. But if your needs are a bit more low-level, Cathode-Retro might be what you’re missing.

OpenMV Promises “Flyby” Imaging Of Components For Pick And Place Project

[iforce2d] has an interesting video exploring whether the OpenMV H7 board is viable as a flyby camera for pick and place, able to quickly snap a shot of a moving part instead of requiring the part to be held still in front of the camera. The answer seems to be yes!

The OpenMV camera module does capture, blob detection, LCD output, and more.

The H7 is OpenMV‘s most recent device, and it supports a variety of useful add-ons such as the global shutter camera sensor [iforce2d] is using here. OpenMV has some absolutely fantastic hardware: the board can snap an image, run blob detection (and other image processing), show the result on a small LCD, and send all the relevant data over the UART while accepting commands about what to look for, all in one neat package.
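The software side of that loop stays refreshingly short on OpenMV. The MicroPython sketch below is not [iforce2d]’s code, just the usual OpenMV pattern with assumed thresholds: grab a frame, find dark blobs, mark them on the optional LCD shield, and push the coordinates out over a UART.

```python
# Typical OpenMV MicroPython structure -- a sketch, not [iforce2d]'s code.
import sensor
import lcd
from pyb import UART

sensor.reset()
sensor.set_pixformat(sensor.GRAYSCALE)   # global shutter module is monochrome
sensor.set_framesize(sensor.QVGA)
sensor.skip_frames(time=2000)            # let exposure settle

uart = UART(3, 115200)                   # link to the pick-and-place controller
lcd.init()                               # optional LCD shield for a live view

while True:
    img = sensor.snapshot()
    # Dark component body against the bright, ring-lit background; the
    # (0, 60) threshold is a guess and would need tuning on real parts.
    for blob in img.find_blobs([(0, 60)], pixels_threshold=100,
                               area_threshold=100):
        img.draw_rectangle(blob.rect())
        img.draw_cross(blob.cx(), blob.cy())
        # Report position and rotation so the host can correct placement.
        uart.write("%d,%d,%.2f\n" % (blob.cx(), blob.cy(), blob.rotation()))
    lcd.display(img)
```

Run that fast enough with a global shutter sensor and the nozzle never has to stop over the camera, which is the whole point of the flyby approach.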

It used to be that global shutter cameras were pretty specialized pieces of equipment, but they’re much more common now. There’s even a Raspberry Pi global shutter camera module, and it’s just so much nicer for machine vision applications.

Watch the test setup as [iforce2d] demonstrates and explains an early proof of concept. A metal fixture on a motor swings over the camera’s lens, with a ring light providing even illumination, and despite the moving target the H7 captures an awfully nice image. Check it out in the video embedded below.
