@TubularCorporation …are all those machines part of a same setup like this?
I wish. I’m visiting family, so I had the cabinet shipped to their house instead of my apartment and put it all together here; the photo is in their spare room. The amp, pedal, box of cables, L-pad, guitar and pants are mine. The sweet 1980s Sanyo fan my dad found about 10 years ago I only WISH was mine.
A few months ago we completed our first music video, and we are now working on a second one for another music act.
We’ve still been streaming every Wednesday for the Daddies On Acid and Interzone Berlin-based parties, and we keep uploading all the recordings to the Internet Archive.
As soon as the lockdowns eased, we got back to performing live at techno parties, also establishing a new collaboration with the music collective Art Bei Ton.
A few days ago, we achieved our dream of making visuals all night long at the Gegen party inside Kit Kat, an infamous fetish techno club.
Next Friday, we will support the legendary synthesizer band Transistors Of Mercy as part of the local festival Krake 2021.
More VJ sets at techno parties are planned for November. In December we will also begin collaborating with Autonoma Industriale, whose events feature techno industrial music, often played live on synthesizers.
hello again! I’ve been trying to spend more time outside before it gets too cold, so I haven’t really paid attention to any video-related projects. But last week it finally was rainy and (kinda) cold, so I stayed home and tried the latest release candidate of vvvv gamma, since they had some improvements in the parts I’m interested in, and it went pretty well. I did a video feedback loop using some vector graphics as a texture input for the 3D engine, where I piped some filters in a loop. It was all rendered in real time while I was listening to Xenakis’s La Légende d’Eer, so it was kind of trippy and matched the music, I suppose. Since vvvv is a patchable real-time environment, I’m thinking of replicating an LZX-style analog synth, but with software modules to patch. Needless to say it looks much nicer when rendered than when captured and published on YT. Any feedback is appreciated (to create a loop haha)
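For anyone curious what "piping filters in a loop" means in practice, here is a minimal sketch in plain Python (the filter and mix functions are illustrative stand-ins, nothing vvvv-specific): each frame, the previous output runs through a small filter chain and gets mixed back with the source, so detail smears and accumulates over time.

```python
# Toy video feedback loop on a 1D "frame" of pixel brightness values.
# blur() and the gain value are illustrative choices, not vvvv nodes.

def blur(frame):
    """Naive box blur: average each pixel with its neighbours."""
    n = len(frame)
    out = []
    for i in range(n):
        window = frame[max(0, i - 1):i + 2]
        out.append(sum(window) / len(window))
    return out

def feedback_pass(source, previous, gain=0.8):
    """Mix the source texture with the filtered previous output, clamped to [0, 1]."""
    filtered = blur(previous)
    return [min(1.0, s + gain * f) for s, f in zip(source, filtered)]

# A single bright "vector" pixel as the source texture.
source = [0.0, 0.0, 1.0, 0.0, 0.0]
frame = source[:]
for _ in range(10):        # iterate the loop; trails spread outward each pass
    frame = feedback_pass(source, frame)

print(frame)  # energy has bled from the centre pixel into its neighbours
```

The same structure scales up to 2D textures and fancier filters; the character of the feedback comes almost entirely from what sits inside the loop.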
when the weather traps you inside with synthesizers, make synthesizer-ade. also, yes! xenakis!
it has been a busy fall. had a lot of shows in a short period of time, one of which was with phase space where we visited a university about 3 hours out of the city (the first time we have done that type of roaming video workshop/event). i also did an artist talk, where i talked about a lot of ideas and things i’ve been working on over the past year – it was live-streamed for ljudmila lab in slovenia, and the archive is up here:
Because my algorithmically generated images look too clean, I try to counteract this. Until now I have been feeding footage from the real world into the algorithms to break the clean look.
Now I have started to disturb the too-clean output on the digital monitor by filming it: for example with an unfocused camera, with a foil hung between the monitor and the camera, or through a cheap plastic lens. The most surprising shots came when I had pieces of glass lying on the monitor. Several of these shots are put together here as a showcase.
woah
Hi!
I’d like to contribute to this post full of wonders.
it’s a video clip made for a friend’s track called Divisione di Cassini (Cassini's Division - Wikipedia)
Peace!
I shared these pictures in another thread, but this was my latest experiment: a bent video tech titler thing. It worked out pretty well, and I’ll share a video of it in action soon.
In other news… I am trying to set up a more ‘fixed’ studio setup for my video gear with a bank of CRTs and more sensible routing etc. I’ve taken on a couple of commissions for bands and musicians which has been fun, and I’m experimenting with different ways to make the footage stand out as opposed to just my usual glitchy pish.
Colliding with the Monitor
Here I have worked with different feedback loops.
One approach is to copy the content of X11 windows and distort it in real time with digital image filters. These programs serve as sources for the image stream: Bespoke’s help text, ImageMagick, VLC, SuperCollider, and various apps from the KDE desktop.
Additionally, in some scenes the screen is filmed with a webcam and fed back into the process via Guvcview.
In both cases the image data flows through “analog Not analog” (aNa). There are then two kinds of feedback: internal loops within the software components, and external feedback when the camera follows the content on the screen.
Recording is divided into short sequences; the individual sections are then put together to form these animations.
Reminder this thread exists, where we can discuss/explain what we are working on.
You can alternatively use this other thread if you simply want to post a video: Post some video link/embeds of your stuff here!
I’ve got a couple of fun things happening on my docket these days
-
working through the necessary steps to get a working beta version of pi4 VSERPI out in the world. ultimate goal here is to get something available & then create a voluminous set of video n text documentation to make it as painless as possible for anyone to get one up and running, and then phase out the whole silly selling-pre-built-units thing entirely so i can finally realize my goal of moving into a cave some day
-
working on a new a/v album with the absurd working title of ‘tantric acid.’ i finished up nearly all the audio stuff start to finish in about 2 weeks when i had a cold about a month ago and have been testing out some concepts for the video part in video waaaves as well. a little excerpt from this project will be screened at the scanlines streaming party on 4-20.
-
teaching myself verilog, vhdl, digital design fundamentals, & low level DSP biz for fpga stuff. i got about as far as i could with all the available vga test pattern stuffs i could find out there and then realized that i just didn’t have enough of a foundation in the digital design side of things to do much more experimentation, so i’m kinda thinking about bouncing back and forth between the vga video oscillator project i have rn and experimenting with a lot of lower level 1d signal stuff first. signed up for some online classes in hdl image processing using larger fpga+arm dev board stuffs too, so i’m trying to come at this project from many different angles
-
developing and refining more classes i can offer irl and online. teaching the coding classes last month was pretty fun but i’d like to be able to offer some more online stuffs to people who have literally zero interest in coding, hence the magick & narrative class which i’ve been tossing around in my brain for a couple years now. thinking about leading some ‘bootcamp’ style classes at phase space this summer too, something where we walk through the steps from start to finish of making like some kind of 5 minute long video art/music video kind of thing over the course of like a week or something.
-
bringing some necessary updates into the desktop video waaaves world, some that i directly want to be using for the tantric acid project. mainly adding more mixing modes (cleaner luma key + chroma key with threshold lines, add, subtract, and mult) and working out the best way to have two individual delay lines to draw from + a more flexible mixing path
After a requisite period mourning the demise of Racer Trash, a number of us have re-formed as Dream Video Division and announced our first two projects: first, a forthcoming pirate broadcast of a remixed LOST, transmitting from a secret location in the Cayman Islands. Second, we are screening NOSFERA2, with an accompanying art installation, at the Overlook Film Festival in New Orleans in June.
…after last month’s YouTube livestream (ambioSonics live session 133 - Glockenbachwerkstatt München - YouTube), where I was experimenting with a Sleepy Circuits Hypno in the background, turned out to be an embarrassing experience (more than half of the cameras just didn’t want to work that evening, and more importantly my OBS scenes were not set up; OBS was not set up at all, as I was expecting the delivery of a Yolobox until that same afternoon, which never arrived), I am now in the process of building a completely new streaming setup paired with an ‘analog’-based visuals setup for our next session on May 9th. For starters it includes, besides the Hypno, an Edirol P-10 and a Roland V-8 as well as two Sony camcorders; there are many, many more (DIY) toys floating around in my cellar waiting to be included in the visuals…
…so much fun, so little time…
The Roland P-10 seems like the perfect hardware backup to have some clips ready to go. I have one, but it’s odd not to see them used more often.
I record the footage at 240 fps. Then the films are converted to 24 fps. The themes are flowing water, flying insects, fabric ribbons in the wind…
This is the result of post-processing the slow-motion shots with openFrameworks and Kdenlive.
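The 240 fps → 24 fps conversion amounts to stretching every frame's timestamp by a factor of ten. One common way to do that is ffmpeg's `setpts` filter; the sketch below just builds the command line (the filenames are placeholders, and `-an` drops audio, which would otherwise fall out of sync at 10x):

```python
# Build an ffmpeg invocation that retimes 240 fps capture for 24 fps playback.
# Filenames are hypothetical; setpts stretches presentation timestamps.

capture_fps = 240
playback_fps = 24
factor = capture_fps / playback_fps   # 240 / 24 = 10.0, i.e. 10x slow motion

cmd = (
    f'ffmpeg -i clip_240fps.mp4 '
    f'-vf "setpts={factor}*PTS" -r {playback_fps} '
    f'-an clip_24fps.mp4'   # -an: discard audio rather than let it drift
)
print(cmd)
```

Because every captured frame is kept, the result stays perfectly smooth at one tenth speed, which suits subjects like flowing water and insect wings.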
dystopian bees
jets of a fountain II
fabric ribbons in the wind
Been enjoying processing footage of video games through framebuffers - !