Sony AVC-3420CE / AVC-3450CE digital video capture

Hi there, my name is Miki and I’m grateful to have found this forum. I’ve recently developed an interest in analog video and enjoy gaining knowledge in new fields through experimenting and researching specific topics. Although I have no background knowledge in electronics, I have become interested in two cameras: the Sony AVC-3420CE and AVC-3450CE. When I acquired them, I thought about giving them a second life by capturing their output digitally with my ThinkPad. I want to create some kind of modern Portapak. Therefore, I started researching and stumbled upon videolabguy, who makes great videos and has a website with lots of useful information. Especially useful to me is his video series about sync generators and these cameras.

Here is the link to one of his videos: Sync Generator Projects, Part 3 - Sony Cameras AVC-3400 & 3450 - YouTube

However, I haven’t found much information beyond this, so I just started anyway. I got myself a 10-pin EIAJ connector to hook up the cameras and built a little contraption to feed the camera power through pins 10 and 9, take the video output from pins 1 and 2, and feed in the vertical and horizontal sync signals through pins 3, 4, and 5.

My first goal was to power these cameras up, which was fairly easy. I got myself a 12V battery to use them on the go, but for testing purposes, I used a 12V power adapter. And well, they seem to work :slight_smile: But this was just the easy part.

The next step was to capture the video output. I got myself a Hauppauge USB-Live2 capture device and soldered a Cinch (RCA) connector to pins 1 and 2. When I tried to capture the video signal with OBS, there were massive dropouts, and the capture device seemed to be overwhelmed by the heavily distorted video signal. When I connected the camera directly to a CRT, the image was heavily distorted, but at least it produced an image.

Of course, I knew about the missing sync signal, so I started searching for a portable sync generator and came across this tutorial for generating video sync signals with an Arduino: Tutorials for generating video sync signals with arduino

So I got myself an Arduino Nano and tried to program a microcontroller for the first time. This is where I’m somewhat stuck now. Unfortunately, I don’t have an oscilloscope, but I managed to program the Arduino and feed both sync signals to the camera. I tried to adapt the code to the needs of the cameras. According to the service manual, the camera needs these two sync pulses:

For the horizontal sync, I calculated (the Nano runs at 16 MHz, so 16 cycles per microsecond):
- whole line: 63.5 microseconds → 63.5 * 16 = 1016 cycles
- h sync pulse: 10 microseconds → 10 * 16 = 160 cycles

The vertical sync works the same way, just with a field period of about 16.7 milliseconds, so I kept these values (both sets of numbers are collected as defines just below):
- whole frame → 525 lines
- v sync pulse → 2 lines
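
Written out as the defines I would plug into the tutorial’s sketch (this just restates the numbers above, assuming the Nano’s 16 MHz clock and Timer1 running without a prescaler; I haven’t verified them against the camera yet):

#define LINE_CYCLES  1016 // 63.5 us * 16 cycles/us: one whole line
#define HSYNC_CYCLES  160 // 10 us * 16 cycles/us: horizontal sync pulse
#define VSYNC_LINES     2 // vertical sync pulse length, in lines
#define FRAME_LINES   525 // lines per frame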

The output shown in OBS changed, but it was still heavily distorted.

Therefore, I tried different values more or less at random and came across this combination, which has produced the best result so far, though it is still not great:

#define LINE_CYCLES 625
#define HSYNC_CYCLES 50
#define VSYNC_LINES 2
#define FRAME_LINES 505
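// For reference, what these experimentally found values work out to at 16 MHz (no prescaler):
//   LINE_CYCLES 625  -> 625 / 16 ~ 39.1 us per line (vs. the 63.5 us calculated above)
//   HSYNC_CYCLES 50  -> 50 / 16 ~ 3.1 us HSync pulse
//   FRAME_LINES 505  -> 505 lines before the line counter wraps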

#define VSYNC_HIGH bitWrite(PORTD, 7, 1)
#define VSYNC_LOW bitWrite(PORTD, 7, 0)

volatile int linecount;

void setup() {
    pinMode(7, OUTPUT); // VSync
    pinMode(9, OUTPUT); // HSync
    // Inverting fast PWM on Timer1 (mode 14, TOP = ICR1, no prescaler):
    // OC1A (pin 9) sits low from the start of each line until the compare match
    // at OCR1A, then goes high, giving an active-low HSync pulse.
    TCCR1A = _BV(COM1A1) | _BV(COM1A0) | _BV(WGM11);
    TCCR1B = _BV(WGM13) | _BV(WGM12) | _BV(CS10);

    ICR1 = LINE_CYCLES; // Overflow at Cycles per line
    OCR1A = HSYNC_CYCLES; // Compare high after HSync cycles

    TIMSK1 = _BV(TOIE1); // Enable timer overflow interrupt
}

ISR(TIMER1_OVF_vect) {
    // Runs once per line: Timer1 overflows each time it reaches TOP (= LINE_CYCLES).
    switch (linecount) {
        case 0: // start of frame: begin the VSync pulse (active low)
            VSYNC_LOW;
            linecount++;
            break;
        case VSYNC_LINES: // end of the VSync pulse after VSYNC_LINES lines
            VSYNC_HIGH;
            linecount++;
            break;
        case FRAME_LINES: // last line: wrap around and start the next frame
            linecount = 0;
            break;
        default:
            linecount++;
    }
}

void loop(){}
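
Since I don’t have an oscilloscope, one idea I haven’t tried yet is to roughly sanity-check the generated sync with a second Arduino, just counting rising edges for a second and printing the result (the pin choice and wiring here are my own assumption; for proper NTSC timing I would expect roughly 15,734 Hz on the HSync line and about 60 Hz on VSync):

volatile unsigned long edges = 0;

// Count one rising edge of the sync line under test.
void countEdge() {
    edges++;
}

void setup() {
    Serial.begin(115200);
    pinMode(2, INPUT); // jumper the HSync (or VSync) output to pin 2 and share ground
    attachInterrupt(digitalPinToInterrupt(2), countEdge, RISING);
}

void loop() {
    noInterrupts();
    edges = 0;
    interrupts();
    delay(1000); // count edges for roughly one second
    noInterrupts();
    unsigned long count = edges;
    interrupts();
    Serial.print("Approx. frequency in Hz: ");
    Serial.println(count);
}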

Just to rule out a faulty capture device and to test my contraption, I tried a more modern Philips Video 400 camera, which generates its own sync, so I don’t need to feed it an external sync signal from the Arduino. That one works flawlessly.

This is where I currently stand. I hope I have described everything comprehensibly. However, I’m unsure what to do next. Is an additional device needed to get rid of the distortion? Could the Arduino be producing a faulty sync signal? Is the idea even feasible?

Any advice or guidance would be greatly appreciated, and I apologize if I have made any simple or obvious mistakes. Nonetheless, it has been a fun learning experience so far :slight_smile:


It seems like you might be testing things out with too many moving parts in unknown condition at the moment. I’d recommend breaking this up into a few smaller projects in order to isolate the important steps:

  1. make sure the cameras are working properly
  2. get the Arduino generating correct sync signals
  3. analog-to-digital video capture

"When I connected the camera directly to a CRT, the image was heavily distorted, but at least it produced an image."

There’s a pretty good chance your cameras need a bit of recapping, depending on how they were used and stored. If so, it doesn’t matter how precise your sync generator is, the video will still be fucked. And depending on how the video is distorted, you won’t stand a chance of getting something clean out of any kind of video ADC. If you start by making sure the video signal at the top of your chain is up to broadcast standards, the rest of the steps will be easier.

Re: Arduino sync generators, I know rob uses them in their stuff. I’d recommend reaching out to them if you think you’ve got issues on that side of things. Fair warning: the first thing they’ll tell you is to get an oscilloscope, because otherwise you’re kind of in the dark.