What's everyone working on these days?

wow this looks/sounds awesome, now I desperately wanna see the patches

1 Like

Setting up the video junk after WAY too long in various cupboards and boxes

2 Likes

Hey,

Looks very interesting. It seems you're making a cartridge for the Genesis, right? Would it be very difficult to make a ROM of it, so it can be used on emulators?

If it's like a videogame, dump it to a ROM file like any other videogame.

Thank you very much for your attention.

1 Like

Hi @Oskar !

It wouldn't be difficult to make a ROM file, but if I did, it wouldn't work on an emulator.

The game/ROM needs the special game cartridge I’m designing to make it work.

You plug your music into the cartridge, it takes data from the music and uses that data to “play” the game, which really means it uses the data to make visuals that move with the music on the TV screen.

If you play the ROM on an emulator, there is nowhere to plug your music in and no hardware to process the music into the data that makes the visuals move.

I hope that makes sense!
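
For the software-curious, the core idea (audio in → control data → visuals) can be sketched in a few lines of JavaScript. This is purely my own toy illustration, not the cartridge's actual logic; the function name and the mappings are made up:

```javascript
// Toy sketch (not the cartridge's real firmware): reduce a buffer of
// audio samples (values in -1..1) to control values that a drawing
// routine could use each frame.
function audioToControls(samples) {
  let sumSq = 0
  let peak = 0
  for (const s of samples) {
    sumSq += s * s
    peak = Math.max(peak, Math.abs(s))
  }
  const rms = Math.sqrt(sumSq / samples.length)
  return {
    scale: 1 + rms * 2,                           // e.g. sprite size follows loudness
    palette: Math.min(15, Math.floor(peak * 16))  // e.g. color index follows peaks
  }
}
```

A real implementation would run something like this once per audio buffer and hand the results to whatever draws the frame; the cartridge presumably does the analogous work in hardware.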

Did a 12 hr ambient video projection set on the ceiling of a space in Berkeley recently… lots of fun to try something new! Here are some captures my friend Cat made…




3 Likes

Silly tune/video I made, getting into new habits other than fatherhood. It was ultimately an excuse to have something to do with waaave_poolhd, so here's my maiden voyage with it.

3 Likes

I made this video art piece for the ancient mantra Oṃ maṇi padme hūṃ
in Hindi and Sanskrit: ॐ मणिपद्मे हूँ
in Tibetan: ཨོཾ་མ་ཎི་པདྨེ་ཧཱུྃ

Considered the essence of Tibetan Buddhism, it is a Sanskrit mantra that was also incorporated into Chinese Taoism. Mad ancient, and very powerful.

2 Likes

I finished my new mobile setup for 2025, which consists of a Sleepy Circuits Hypno, Waaave_Pool, and a V-02HD (plus a Hydrasynth for CV and MIDI stuff), all stuffed inside a flight case to accompany my hardware techno (I call it my HYPNO_POOL setup). For my pilot project (and to get used to the Waaave_Pool) I made a mashup of Quentin Tarantino films set to one of my songs.

2 Likes

…made a 'mostly harmless' little piece for the weekly Hardware Jams Weekend Challenge in the morning and a bit of lunch-time…

…the challenge topic is 'make a trippy acid track and video in honor of March 3rd, the 303 day'…

…music made with a MAM MB33 Mk II, Sonicware CyDrums and Matt Bradshaw's DrumKid; the video chain was

Resolume > 2x Blackmagic converters > LZX TBC2 > Erogenous Tones Structure > Syntonie Stable > Syntonie CBV1 > Syntonie Stable > Syntonie VU007B > Magewell DVI+ > Resolume…

2 Likes

I’ve been working on this portable passion project. I’m about 95% finished and will soon make a post of the build.

4 Likes

in my increasingly sparse free time, I've been playing with moire patterns and images, as I've long been interested in both moire patterns and collage. Here's a morphing moire pattern created with a moire generator I wrote in openFrameworks (shoutout to @LiquidLightLab for inspiration), keying two static images.

3 Likes

Video and audio recorded from homemade video synthesizer. Audio captured directly from the RGB signal in the synth. Further processed in DaVinci Resolve and Ableton Live.

7 Likes

I'm not quite sure where to put this, so I'll put it here.
I recently picked up the hydra video synth, running it locally to play around. What bothered me was that there was no way to get a second output for the visuals.
So, with the help of vibe coding, it's now possible (even though hacky). I have no clue about coding in JavaScript, but so far it seems to work.

What the language model came up with was a modification to the index.js file. All I did was copy this code to the bottom of the file:

// Setup the mirror after a short delay (ensure canvas is ready)
setTimeout(() => {
  setupMirrorWindow()
}, 500)

function setupMirrorWindow() {
  const sourceCanvas = document.querySelector('canvas')

  if (!sourceCanvas) {
    console.error("Hydra canvas not found")
    return
  }

  const win = window.open('', 'HydraOutput', 'width=1920,height=1080')

  if (!win) {
    console.error("Popup blocked or failed to open")
    return
  }

  // Inject minimal HTML and CSS
  win.document.write(`
    <!DOCTYPE html>
    <html>
    <head>
      <style>
        html, body {
          margin: 0;
          overflow: hidden;
          background: black;
        }
        canvas {
          display: block;
          width: 100vw;
          height: 100vh;
        }
      </style>
    </head>
    <body>
      <canvas></canvas>
    </body>
    </html>
  `)
  // Close the document so the browser knows loading is finished
  win.document.close()

  const newCanvas = win.document.querySelector('canvas')
  const ctx = newCanvas.getContext('2d')

  function resizeCanvas() {
    newCanvas.width = win.innerWidth
    newCanvas.height = win.innerHeight
  }

  win.addEventListener('resize', resizeCanvas)
  resizeCanvas()

  function drawFrame() {
    ctx.clearRect(0, 0, newCanvas.width, newCanvas.height)
    ctx.drawImage(sourceCanvas, 0, 0, newCanvas.width, newCanvas.height)
    requestAnimationFrame(drawFrame)
  }

  drawFrame()
}

and then ran hydra locally, and it just worked :)

Now when you open the localhost hydra page, a popup window appears that mirrors just the visual output of your code. When you modify your hydra code, the popup updates automatically.

3 Likes

For the last month I've been trying to get compressure to work in real time, to make it more like a video instrument. I'm not totally there yet, but I've made some big progress: I can now control it live with a short but noticeable lag. The shorter experimental cycle lets me find new patterns and tendencies across different content, encodings, and frame navigation. Here's one such experiment.

For those wondering what's happening under the hood: what you're seeing is a sort of cross between datamoshing and granular synthesis, based on hacking the digital video compression algorithm. The input is a video of a camera moving through a forest, and the first second or so shows exactly that (albeit half a second repeated a few times). After the first few samples, we track forward a few frames and let the motion vectors and error correction encoded in the video file feed back into each other, producing this melty feedback mess.
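
The feedback step can be sketched with a toy model (my own one-dimensional illustration in JavaScript, not compressure's actual code): build each new frame by applying motion vectors to the previous *output* frame while skipping the residual/error-correction pass, and the image smears into itself.

```javascript
// Toy model of datamosh-style feedback: pixel i copies from the
// previous OUTPUT frame at offset vectors[i], and the residual that
// would normally correct the copy is skipped entirely.
function applyMotion(frame, vectors) {
  return frame.map((_, i) => {
    // clamp the source index so edges repeat instead of wrapping
    const src = Math.min(frame.length - 1, Math.max(0, i + vectors[i]))
    return frame[src]
  })
}

let frame = [0, 1, 2, 3, 4, 5, 6, 7]  // a 1-D "frame" of pixel values
const vectors = new Array(8).fill(1)  // every pixel copies its right-hand neighbour

for (let t = 0; t < 3; t++) {
  frame = applyMotion(frame, vectors)
}
// Because the source is always the previous output, the edge value
// keeps smearing through the frame instead of the true image being
// restored — the 1-D version of the melty mess.
```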

Happy to answer any questions.

7 Likes

that is freaking COOL! ^ _ ^ i’m definitely lost in the woods now without any way out

1 Like

here it is! finally. a tribute to the legendary Aphex Twin!

3 Likes

Here are some experiments combining my moire generator and compressure, giving some interesting textures and movement.

2 Likes