
AVCaptureSession sample buffer to pixels inconsistent values

revg · Posts: 71 · Registered Users
edited October 2011 in iPhone SDK Development
I have the following code, where I grab each captured frame from the iPhone video camera and try to loop through all the pixels in the image.

I wrote the code below, but for some reason I get slightly different color values when I use my RGBPixel struct method versus grabbing the pixel colors straight from the uint8_t buffer that I get from CVPixelBufferGetBaseAddress. I have been pulling my hair out trying to figure out what I might be doing wrong; I don't see why the r, g, b values are not the same for each method. They should be.

What I want to do is get the average r, g, b values for the entire image, but I need it to be accurate, and I can't figure out why my results are not consistent. Can anyone look at my code and see what I might be doing wrong?

Thanks!
typedef unsigned char byte;
typedef struct RGBPixel {
    byte red, green, blue;
} RGBPixel;

- (void)captureOutput:(AVCaptureOutput *)captureOutput
didOutputSampleBuffer:(CMSampleBufferRef)sampleBuffer
       fromConnection:(AVCaptureConnection *)connection
{
    NSAutoreleasePool *pool = [[NSAutoreleasePool alloc] init];

    CVImageBufferRef imageBuffer = CMSampleBufferGetImageBuffer(sampleBuffer);
    CVPixelBufferLockBaseAddress(imageBuffer, 0);

    size_t bytesPerRow = CVPixelBufferGetBytesPerRow(imageBuffer);
    size_t height = CVPixelBufferGetHeight(imageBuffer);
    uint8_t *src_buff = (uint8_t *)CVPixelBufferGetBaseAddress(imageBuffer);

    // Method 1: reinterpret the raw bytes as an array of RGBPixel structs.
    RGBPixel *pixelData = (RGBPixel *)src_buff;

    size_t len = bytesPerRow * height;
    for (size_t i = 0; i < len; i += 4) {

        RGBPixel pixel = pixelData[i / 4];

        int a = 0;
        int r = pixel.red;
        int g = pixel.green;
        int b = pixel.blue;

        NSLog(@"first values = r:%d g:%d b:%d", r, g, b);

        // Method 2: read the channels for the same pixel directly from the byte buffer.
        a = src_buff[i + 3];
        r = src_buff[i + 2];
        g = src_buff[i + 1];
        b = src_buff[i];

        NSLog(@"second values = r:%d g:%d b:%d", r, g, b);
    }

    // Keep the buffer locked until we are done reading from it.
    CVPixelBufferUnlockBaseAddress(imageBuffer, 0);

    [pool drain];
}

Replies

  • revg · Posts: 71 · Registered Users
    edited October 2011
    I hate answering my own question... but hey, that's the best way to figure out the answer, right? Spend 15 hours trying to fix the bug, finally break down and post it, and like clockwork you'll solve it yourself minutes after posting ;)

    The problem was that my image was in BGRA format, so I had the red and blue fields in my struct in the wrong places. It needs to be like below:
    typedef struct RGBPixel{
        byte blue, green, red;
    } RGBPixel;
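
    One thing worth flagging for anyone reusing this: a 32-bit BGRA buffer has four bytes per pixel, so a three-byte struct like the one above still drifts out of step with the buffer after the first pixel (pixelData[i/4] advances in 3-byte strides while i advances in 4-byte strides). Here is a minimal sketch of the row-by-row averaging the original question asked for, assuming the frames really are kCVPixelFormatType_32BGRA (BGRAPixel and averageRGB are illustrative names, not from this thread):

    #import <Foundation/Foundation.h>
    #import <CoreVideo/CoreVideo.h>

    // Four bytes per pixel, matching the 32BGRA memory layout.
    typedef struct BGRAPixel {
        unsigned char blue, green, red, alpha;
    } BGRAPixel;

    static void averageRGB(CVImageBufferRef imageBuffer)
    {
        CVPixelBufferLockBaseAddress(imageBuffer, 0);

        size_t bytesPerRow = CVPixelBufferGetBytesPerRow(imageBuffer);
        size_t width = CVPixelBufferGetWidth(imageBuffer);
        size_t height = CVPixelBufferGetHeight(imageBuffer);
        uint8_t *base = (uint8_t *)CVPixelBufferGetBaseAddress(imageBuffer);

        uint64_t rSum = 0, gSum = 0, bSum = 0;
        for (size_t y = 0; y < height; y++) {
            // Walk row by row: bytesPerRow can be wider than width * 4 due to padding.
            BGRAPixel *row = (BGRAPixel *)(base + y * bytesPerRow);
            for (size_t x = 0; x < width; x++) {
                rSum += row[x].red;
                gSum += row[x].green;
                bSum += row[x].blue;
            }
        }

        CVPixelBufferUnlockBaseAddress(imageBuffer, 0);

        uint64_t n = (uint64_t)width * height;
        NSLog(@"average r:%llu g:%llu b:%llu", rSum / n, gSum / n, bSum / n);
    }

    Indexing row by row also sidesteps the padding issue: iterating over bytesPerRow * height, as in the original code, averages in garbage padding bytes whenever bytesPerRow is wider than width * 4.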
    
  • aoredsson · Posts: 26 · Registered Users
    edited October 2011
    Then it should be called BGRPixel, not RGBPixel. I have done a lot of video capturing on the iPhone, and I often run into the BGR format. Does anybody know if it's the native video format, or why Apple uses it all the time?

    Basically you have the following format combinations: RGB or BGR, either without an A(lpha), with the A in front, or with the A at the back. At least when capturing/rendering stuff. If you don't want to guess, you can pin the format down yourself; see the sketch below.
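
    Rather than relying on whatever the camera delivers by default, you can ask AVCaptureVideoDataOutput for 32-bit BGRA explicitly, so the byte order in the delegate is never in question. A minimal sketch, assuming a pre-ARC setup like the code above (MakeBGRAVideoOutput is an illustrative name):

    #import <AVFoundation/AVFoundation.h>

    // Build a video data output that requests 32-bit BGRA frames,
    // so the capture delegate never has to guess the channel order.
    static AVCaptureVideoDataOutput *MakeBGRAVideoOutput(void)
    {
        AVCaptureVideoDataOutput *videoOutput =
            [[[AVCaptureVideoDataOutput alloc] init] autorelease];
        videoOutput.videoSettings = [NSDictionary dictionaryWithObject:
                [NSNumber numberWithUnsignedInt:kCVPixelFormatType_32BGRA]
            forKey:(id)kCVPixelBufferPixelFormatTypeKey];
        return videoOutput;
    }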