Merging content of UIImageView and EAGLview

NewiPhoneDeveloper · Posts: 459 · Registered Users
edited March 2011 in iOS SDK Development
Hello,

I'm almost done with my current project, but somehow can't solve the following problem.

My app uses parts of the GLPaint demo app to allow the user to draw on the iPhone screen. The EAGLview is set to transparent using
eaglLayer.opaque = NO;

Below it I have a UIImageView that holds an image. This all looks great, but I haven't been able to merge both into one image that can be saved to the photo library.

Here is what I tried so far:

1) Trying to make a simple screenshot
UIGraphicsBeginImageContext(self.window.bounds.size);
[self.window.layer renderInContext:UIGraphicsGetCurrentContext()];
UIImage *viewImage = UIGraphicsGetImageFromCurrentImageContext();
UIGraphicsEndImageContext();

UIImageWriteToSavedPhotosAlbum(viewImage, nil, nil, nil);

This doesn't work: it only grabs the contents of the UIImageView below my EAGLview, because -renderInContext: doesn't capture OpenGL content.

2) Grabbing content of the EAGLview and putting it in a new UIImageView, which is positioned above the old UIImageView:
- (UIImage *) glToUIImage {
    NSInteger myDataLength = 320 * 480 * 4;

    // allocate array and read pixels into it.
    GLubyte *buffer = (GLubyte *) malloc(myDataLength);
    glReadPixels(0, 0, 320, 480, GL_RGBA, GL_UNSIGNED_BYTE, buffer);

    // gl renders "upside down" so swap top to bottom into new array.
    // there's gotta be a better way, but this works.
    GLubyte *buffer2 = (GLubyte *) malloc(myDataLength);
    for(int y = 0; y < 480; y++)
    {
        for(int x = 0; x < 320 * 4; x++)
        {
            buffer2[(479 - y) * 320 * 4 + x] = buffer[y * 4 * 320 + x];
        }
    }
    free(buffer);

    // make data provider with data.
    // note: buffer2 is leaked here; pass a release callback to
    // CGDataProviderCreateWithData to free it once the image is done with it.
    CGDataProviderRef provider = CGDataProviderCreateWithData(NULL, buffer2, myDataLength, NULL);

    // prep the ingredients
    int bitsPerComponent = 8;
    int bitsPerPixel = 32;
    int bytesPerRow = 4 * 320;
    CGColorSpaceRef colorSpaceRef = CGColorSpaceCreateDeviceRGB();
    CGBitmapInfo bitmapInfo = kCGBitmapByteOrderDefault;
    CGColorRenderingIntent renderingIntent = kCGRenderingIntentDefault;

    // make the cgimage
    CGImageRef imageRef = CGImageCreate(320, 480, bitsPerComponent, bitsPerPixel, bytesPerRow, colorSpaceRef, bitmapInfo, provider, NULL, NO, renderingIntent);
    CGColorSpaceRelease(colorSpaceRef);
    CGDataProviderRelease(provider);

    // then make the uiimage from that
    UIImage *myImage = [UIImage imageWithCGImage:imageRef];
    CGImageRelease(imageRef);
    return myImage;
}
This returns the content of the EAGLview in a UIImage. It works fine, except for one major problem: there is no transparency, so parts that should be transparent come out black.
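By the way, the flip loop above just reverses the row order of the RGBA buffer. Here is the same transform as a standalone plain-C sketch (the function name is mine, not from any SDK), copying whole rows at a time:

```c
#include <string.h>

/* Flip an RGBA pixel buffer vertically: glReadPixels returns rows
   bottom-up, while CGImage expects them top-down. 4 bytes per pixel. */
static void flip_rgba_rows(const unsigned char *src, unsigned char *dst,
                           int width, int height)
{
    int stride = width * 4; /* bytes per row */
    for (int y = 0; y < height; y++) {
        /* source row y becomes destination row (height - 1 - y) */
        memcpy(dst + (height - 1 - y) * stride, src + y * stride, stride);
    }
}
```

Row-wise memcpy is also noticeably faster than the byte-by-byte inner loop.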

3) Grabbing the content of the EAGLview (as before) and merging it with the content of my UIImageView using Quartz:
CGImageRef      brushImage;
CGContextRef    brushContext;
size_t          width, height;

brushImage = [UIImage imageWithContentsOfFile:[[NSBundle mainBundle] pathForResource:@"imageOfUIImageView" ofType:@"png"]].CGImage;

void *          bitmapData;
int             bitmapByteCount;
int             bitmapBytesPerRow;
CGColorSpaceRef colorSpace;

bitmapBytesPerRow = (320 * 4);
bitmapByteCount   = (bitmapBytesPerRow * 480);
bitmapData        = malloc(bitmapByteCount); // allocate only after the size has been computed
colorSpace        = CGColorSpaceCreateDeviceRGB();

width  = CGImageGetWidth(brushImage);
height = CGImageGetHeight(brushImage);

// Use the bitmap creation function provided by the Core Graphics framework.
brushContext = CGBitmapContextCreate(bitmapData,
                                     320,
                                     480,
                                     8,
                                     bitmapBytesPerRow,
                                     colorSpace,
                                     kCGImageAlphaPremultipliedFirst);
CGColorSpaceRelease(colorSpace);

// After you create the context, you can draw the image to the context.
// image is a UIImage that holds the image returned from the glToUIImage method above.
// Draw the EAGLview content (now in image) to the context:
CGContextDrawImage(brushContext, CGRectMake(0.0, 0.0, (CGFloat)320, (CGFloat)480), image.CGImage);

// Set the blend mode
CGContextSetBlendMode(brushContext, kCGBlendModeLighten); // lighten seems to give the "best" result
// draw the background to the context
CGContextDrawImage(brushContext, CGRectMake(0.0, 0.0, (CGFloat)320, (CGFloat)480), brushImage);

// create a new image that combines both the EAGLview and my background image
CGImageRef mergedImage = CGBitmapContextCreateImage(brushContext);
CGContextRelease(brushContext);

// write the merged image to the photo library
UIImageWriteToSavedPhotosAlbum([UIImage imageWithCGImage:mergedImage], self, nil, nil);
CGImageRelease(mergedImage);

This doesn't work either. Depending on the blend mode I choose, the result sometimes looks better, sometimes worse, but it is never close to what I need. On the other hand, if I simply press the home and on/off buttons to take a screenshot manually, the result is exactly what I need.
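For reference, the blend-mode hunt hints that the missing alpha channel, not the blend mode, is the real problem: with proper alpha, Quartz's default source-over compositing already produces what the manual screenshot shows. A plain-C sketch of the per-channel "over" equation (8-bit premultiplied values; the function name is mine):

```c
/* Porter-Duff "over" for one colour channel with premultiplied alpha
   (all values 0..255): result = src + dst * (255 - srcAlpha) / 255.
   This is what the default blend mode computes, so once the EAGLview
   image carries real alpha, no special blend mode is needed. */
static int over_premul(int src, int dst, int src_alpha)
{
    return src + dst * (255 - src_alpha) / 255;
}
```

A fully transparent source pixel (src = 0, src_alpha = 0) leaves the background untouched, and a fully opaque one replaces it, which is the behaviour a Lighten workaround cannot reproduce.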

Maybe you have some ideas about how this can be achieved. It's really the final touch for my app, and I definitely don't want the user to have to take the screenshot manually.

Any help appreciated! Thanks in advance.
Post edited by NewiPhoneDeveloper
Websites: Friendlydeveloper (www.friendlydeveloper.com) - Coding Blog; Codingsessions (www.codingsessions.com) - Live iOS Training

Replies

  • NewiPhoneDeveloper · Posts: 459 · Registered Users
    edited June 2009
    I guess I can solve the problem if someone can tell me how to get the content of my EAGLview WITH an alpha channel, so transparent parts no longer appear black. Please check out the glToUIImage method above; maybe something can be modified to make it work.

    Thanks.
  • NewiPhoneDeveloper · Posts: 459 · Registered Users
    edited June 2009
    I've done more research on this topic but haven't had much luck yet. I've nailed it down to the fact that whenever I try to grab the content of my EAGLview using the glToUIImage method (see above), it completely ignores the alpha values. This has been discussed on other forums as well, but no one has been able to answer it yet.
  • digicide · Posts: 5 · New Users
    edited July 2009
    There is a private function, UIGetScreenImage(), that will take a screenshot programmatically. Officially, Apple doesn't allow private APIs to be used, but in this case I doubt they'd notice. Here's a usage example (from code by Rob Terrell):
    + (UIImage *)imageWithScreenContents
    {
        CGImageRef cgScreen = UIGetScreenImage();
        if (cgScreen) {
            UIImage *result = [UIImage imageWithCGImage:cgScreen];
            CGImageRelease(cgScreen);
            return result;
        }
        return nil;
    }
    

    Now that we have that sorted, PLEASE tell me how you were able to use GLPaint above a UIImage. No matter what I try, the blending seems to be off, and there are either dark edges on the lines or the colors are inverted... I've been at this for several days now.
  • NewiPhoneDeveloper · Posts: 459 · Registered Users
    edited July 2009
    digicide wrote: »
    There is a private function, UIGetScreenImage(), that will take a screenshot programmatically. Officially, Apple doesn't allow private APIs to be used, but in this case I doubt they'd notice. Here's a usage example (from code by Rob Terrell):
    + (UIImage *)imageWithScreenContents
    {
        CGImageRef cgScreen = UIGetScreenImage();
        if (cgScreen) {
            UIImage *result = [UIImage imageWithCGImage:cgScreen];
            CGImageRelease(cgScreen);
            return result;
        }
        return nil;
    }
    

    Now that we have that sorted, PLEASE tell me how you were able to use GLPaint above a UIImage. No matter what I try, the blending seems to be off, and there are either dark edges on the lines or the colors are inverted... I've been at this for several days now.

    Actually, I even managed to work around that private API and get the same effect using officially allowed methods only :)

    Anyway, first things first. To show the UIImage below your EAGLView, you have to do two things:

    1) in your EAGLView.m set:
    eaglLayer.opaque = NO;
    //I'm using EAGLView version 1.6 - can be found in -(BOOL) _createSurface method
    
    This will make the EAGLView transparent.

    2) In GLPaint, PaintingView is created directly from the AppDelegate, so it sits on the AppDelegate's window. Now simply set your image as the window's background, like:
    UIImage *patternImage = [UIImage imageWithContentsOfFile:[[NSBundle mainBundle] pathForResource:@"nameOfYourImage" ofType:@"png"]];
    [self.window setBackgroundColor:[UIColor colorWithPatternImage:patternImage]];
    

    Hope, that helps...
  • NewiPhoneDeveloper · Posts: 459 · Registered Users
    edited July 2009
    @digicide: feel welcome to PM me if you need more detailed information beyond this thread, and please also allow others to send you PMs (the system told me you disabled it).
  • digicide · Posts: 5 · New Users
    edited July 2009
    It sounds like a good idea... I'm still having blending problems with it, though. I get a result like this:

    sample.png

    The color is supposed to be gray, and it looks right when I have a black background instead of an image. Did you change the glBlendFunc or use a different particle image?

    Edit: Sorry about the PMing... my account seems to be screwed up. I have 1 of 0 allowed PMs, and whenever I try to view my options or details it tells me I don't have permission.
  • NewiPhoneDeveloper · Posts: 459 · Registered Users
    edited July 2009
    digicide wrote: »
    It sounds like a good idea... I'm still having blending problems with it, though. I get a result like this:

    sample.png

    The color is supposed to be gray, and it looks right when I have a black background instead of an image. Did you change the glBlendFunc or use a different particle image?

    Yes, I created my own particle in Photoshop. It's actually quite simple: create a new image (RGB color, 8-bit, 64x64 pixels) and use a soft basic brush to create a pattern like this:

    particle.png

    I don't mind if you use mine :)
  • digicide · Posts: 5 · New Users
    edited July 2009
    Thanks for working with me on this--
    With your image I get this:

    sample2.png

    I suspect I'm loading the texture wrong, perhaps on this line?
    brushContext = CGBitmapContextCreate(brushData, width, width, 8, width * 4, CGImageGetColorSpace(brushImage), kCGImageAlphaPremultipliedLast);
    
  • NewiPhoneDeveloper · Posts: 459 · Registered Users
    edited July 2009
    digicide wrote: »
    Thanks for working with me on this--
    With your image I get this:

    sample2.png

    I suspect I'm loading the texture wrong, perhaps on this line?
    brushContext = CGBitmapContextCreate(brushData, width, width, 8, width * 4, CGImageGetColorSpace(brushImage), kCGImageAlphaPremultipliedLast);
    

    No, that seems fine. Try the following OpenGL settings in your init method:
    //Set up OpenGL projection matrix
    glDisable(GL_DITHER);
    glMatrixMode(GL_PROJECTION);
    glOrthof(0, frame.size.width, 0, frame.size.height, -1, 1);
    glMatrixMode(GL_MODELVIEW);
    glEnable(GL_TEXTURE_2D);
    glEnableClientState(GL_VERTEX_ARRAY);
    glEnableClientState(GL_TEXTURE_COORD_ARRAY);
    glEnable(GL_BLEND);
    glBlendFunc(GL_ONE, GL_ONE_MINUS_SRC_ALPHA);
    glClearColor(0.0, 0.0, 0.0, 0.0);
    glColor4f(1.0, 1.0, 1.0, 1.0);
    glEnable(GL_POINT_SPRITE_OES);
    glTexEnvf(GL_POINT_SPRITE_OES, GL_COORD_REPLACE_OES, GL_TRUE);
    
  • digicide · Posts: 5 · New Users
    edited July 2009
    Some improvement... the changes I saw were from the blending function and the alpha component of glColor4f; I've had mine at 0.15. The left picture here is the result with the alpha at 0.15, the right with the alpha at 1. This will be acceptable, but I'd rather not lose the additive blending, where lines drawn over each other start to turn white.

    sample3.png

    Thanks again!
  • NewiPhoneDeveloper · Posts: 459 · Registered Users
    edited July 2009
    digicide wrote: »
    Some improvement... the changes I saw were from the blending function and the alpha component of glColor4f; I've had mine at 0.15. The left picture here is the result with the alpha at 0.15, the right with the alpha at 1. This will be acceptable, but I'd rather not lose the additive blending, where lines drawn over each other start to turn white.

    sample3.png

    Thanks again!

    No problem :) I guess you won't get additive blending that way. What you're doing now is "erasing" parts of your EAGLView, making it more or less transparent so the background image shines through. For additive blending, you will have to draw on your EAGLView, like the GLPaint app does.
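    To illustrate the difference, here are the two per-channel blend equations as a plain-C sketch (8-bit values; the function names are mine):

```c
/* Additive blending (glBlendFunc(GL_ONE, GL_ONE)): overlapping strokes
   sum and saturate toward white. */
static int blend_add(int src, int dst)
{
    int v = src + dst;
    return v > 255 ? 255 : v;
}

/* Premultiplied source-over (glBlendFunc(GL_ONE, GL_ONE_MINUS_SRC_ALPHA)):
   overlapping strokes converge on the stroke colour instead of brightening. */
static int blend_over(int src, int dst, int src_alpha)
{
    return src + dst * (255 - src_alpha) / 255;
}
```

    With blend_add, drawing the same half-bright stroke twice already doubles the value toward white; with blend_over at full alpha, the second stroke just repaints the same value.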
  • saran · Posts: 10 · Registered Users
    edited August 2009
    Hi NewiPhoneDeveloper... I stumbled upon the same problem as digicide in an application I am building: merging the UIImage with the EAGLview. You mentioned that you managed to merge the images using the 'legal' method.

    Could you please share it?

    Thanks,
    Saran.
  • NewiPhoneDeveloper · Posts: 459 · Registered Users
    edited August 2009
    saran wrote: »
    Hi NewiPhoneDeveloper... I stumbled upon the same problem as digicide in an application I am building: merging the UIImage with the EAGLview. You mentioned that you managed to merge the images using the 'legal' method.

    Could you please share it?

    Thanks,
    Saran.

    Sure, let me try to remember. The whole thing is a bit complex, but not really complicated if you know OpenGL a bit. Maybe I'm not doing it the best way, but so far my method has done a great job.

    Here is a rough step-by-step guide for merging an image (or the content of a UIImageView) with the content of an EAGLView object:

    1) Create a new texture using the Texture2D class, like:
    _textures[kTexture_Clean] = [[Texture2D alloc] initWithImage:[UIImage imageNamed:@"clean.png"]];
    

    Mind that "clean.png" is an image inside my project folder; it's just 320x480 pixels with nothing on it.


    2) Bind the new texture:
    glBindTexture(GL_TEXTURE_2D,[_textures[kTexture_Clean] name]);
    

    3) The following grabs the screen content INCLUDING ALL ALPHA CHANNELS and puts it on our CLEAN texture:
    glCopyTexSubImage2D(GL_TEXTURE_2D, 0, 0, 0, 0, 0, 320, 480);
    

    Right now you have your framebuffer on the new texture. So all you have to do now is get your background image into the scene.

    4) We're rebuilding the scene back to front. First we draw the background image, now using OpenGL:
    [_textures[kTexture_Background] drawInRect:[self bounds]];
    

    5) Draw the new texture (the clean texture with our framebuffer on it) over the background image, blending enabled:
    [_textures[kTexture_Clean] drawInRect:[self bounds]:0.0:0.0];
    

    Get the idea? Now you have everything in your scene, which also means you can safely use glReadPixels to take the screenshot. To do so, simply call a method like the one below:
    - (void)grabScreen {
        NSInteger dataLength = 320 * 480 * 4;
        GLubyte *buffer = (GLubyte *) malloc(dataLength); // heap, not stack - 600 KB is too big for a stack array
        glReadPixels(0, 0, 320, 480, GL_RGBA, GL_UNSIGNED_BYTE, buffer);

        CGDataProviderRef ref = CGDataProviderCreateWithData(NULL, buffer, dataLength, NULL);
        CGColorSpaceRef colorSpace = CGColorSpaceCreateDeviceRGB();
        CGImageRef iref = CGImageCreate(320, 480, 8, 32, 320 * 4, colorSpace, kCGBitmapByteOrderDefault, ref, NULL, true, kCGRenderingIntentDefault);

        size_t width  = CGImageGetWidth(iref);
        size_t height = CGImageGetHeight(iref);
        uint32_t *pixels = (uint32_t *) malloc(width * height * 4);
        // bitmap contexts require premultiplied alpha
        CGContextRef context = CGBitmapContextCreate(pixels, width, height, 8, width * 4, CGImageGetColorSpace(iref), kCGImageAlphaPremultipliedLast | kCGBitmapByteOrder32Big);
        // flip the image the right way up while redrawing it
        CGContextTranslateCTM(context, 0.0, height);
        CGContextScaleCTM(context, 1.0, -1.0);
        CGContextDrawImage(context, CGRectMake(0.0, 0.0, width, height), iref);
        CGImageRef outputRef = CGBitmapContextCreateImage(context);
        UIImage *outputImage = [UIImage imageWithCGImage:outputRef];

        UIImageWriteToSavedPhotosAlbum(outputImage, nil, nil, nil); // YAY - SAVE IT :-)

        CGImageRelease(outputRef);
        CGContextRelease(context);
        CGImageRelease(iref);
        CGColorSpaceRelease(colorSpace);
        CGDataProviderRelease(ref);
        free(buffer);
        // pixels is intentionally not freed here; outputImage may still reference it
    }
    

    In case the above steps don't make sense, I suggest checking out the CrashLanding and GLPaint sample apps; I adapted parts of that code while learning from them.

    Hope, this helps...
  • saran · Posts: 10 · Registered Users
    edited August 2009
    Thank you for the reply.

    However, the Texture2D class doesn't seem to be supported any more. How do I go about it then?

    Also, from what I understand, I will be redrawing the EAGLView, so the background will be drawn in as well, and I will not be able to change the background again without losing the drawings.

    Saran.
  • NewiPhoneDeveloper · Posts: 459 · Registered Users
    edited August 2009
    saran wrote: »
    Thank you for the reply.

    However, the Texture2D class doesn't seem to be supported any more. How do I go about it then?

    Also, from what I understand, I will be redrawing the EAGLView, so the background will be drawn in as well, and I will not be able to change the background again without losing the drawings.

    Saran.

    Texture2D no longer supported? Says who? I'm successfully using it in several projects. No worries, it will work just fine.

    Yes, you will be redrawing the EAGLView, but only before saving the sequence.
  • saran · Posts: 10 · Registered Users
    edited August 2009
    Texture2D no longer supported? Says who? I'm successfully using it in several projects. No worries, it will work just fine.

    Yes, you will be redrawing the EAGLView, but only before saving the sequence.

    Hi, sorry... I thought Texture2D was an Apple SDK class, so when I searched for the documentation and couldn't find it, I misunderstood and assumed it was no longer supported. Thank you for pointing it out.

    It worked fine, except that the background was darker than the image I used. I guess it has something to do with the blend function.

    I tried a couple:

    glBlendFunc(GL_SRC_ALPHA, GL_ONE);
    glBlendFunc(GL_ONE, GL_ONE_MINUS_SRC_ALPHA);

    Neither seemed to work.

    Any suggestions, please?
  • saran · Posts: 10 · Registered Users
    edited August 2009
    saran wrote: »
    Hi, sorry... I thought Texture2D was an Apple SDK class, so when I searched for the documentation and couldn't find it, I misunderstood and assumed it was no longer supported. Thank you for pointing it out.

    It worked fine, except that the background was darker than the image I used. I guess it has something to do with the blend function.

    I tried a couple:

    glBlendFunc(GL_SRC_ALPHA, GL_ONE);
    glBlendFunc(GL_ONE, GL_ONE_MINUS_SRC_ALPHA);

    Neither seemed to work.

    Any suggestions, please?

    Sorry if my post was ambiguous. What I meant was that the image I used as the background turned out darker, with higher contrast, in the image saved to the photo album.

    The texture I grabbed from the EAGLview is fine, though.

    Thanks,
    Saran.
  • saran · Posts: 10 · Registered Users
    edited August 2009
    Actually, I was wrong...

    Even the texture I grabbed from the EAGLView onto the 'clean' texture was reduced in brightness.

    Any idea why?

    Saran.
  • saran · Posts: 10 · Registered Users
    edited August 2009
    - (CGImageRef) getPicture
    {
        glEnableClientState(GL_TEXTURE_COORD_ARRAY);

        // Create and init the textures
        Texture2D *_textures[2];
        _textures[0] = [[Texture2D alloc] initWithImage:[UIImage imageNamed:@"clean.png"]];
        _textures[1] = [[Texture2D alloc] initWithImage:[UIImage imageNamed:@"testbkgnd1.jpg"]];

        // Get the drawing on the screen
        glBindTexture(GL_TEXTURE_2D, [_textures[0] name]);
        glCopyTexSubImage2D(GL_TEXTURE_2D, 0, 0, 0, 0, 0, kScreenWidth, kScreenHeight);

        // Redraw the scene
        [_textures[1] drawInRect:[self bounds]];
        [_textures[0] drawInRect:[self bounds]];

        // Entire data length
        NSInteger myDataLength = kScreenWidth * kScreenHeight * 4;

        // allocate arrays and read pixels into them.
        static GLubyte *buffer = NULL;
        static GLubyte *invertedBuffer = NULL;

        if (buffer == NULL)
            buffer = (GLubyte *) malloc(myDataLength);

        if (invertedBuffer == NULL)
            invertedBuffer = (GLubyte *) malloc(myDataLength);

        glReadPixels(0, 0, kScreenWidth, kScreenHeight, GL_RGBA, GL_UNSIGNED_BYTE, buffer);

        // reverse the height, as OpenGL draws it upside down
        for (int y = 0; y < kScreenHeight; y++)
        {
            for (int x = 0; x < kScreenWidth * 4; x++)
            {
                invertedBuffer[((kScreenHeight - 1) - y) * kScreenWidth * 4 + x] = buffer[y * 4 * kScreenWidth + x];
            }
        }

        // make data provider with data.
        CGDataProviderRef provider = CGDataProviderCreateWithData(NULL, invertedBuffer, myDataLength, NULL);

        // prep the ingredients
        int bitsPerComponent = 8;
        int bitsPerPixel = 32;
        int bytesPerRow = 4 * kScreenWidth;
        CGColorSpaceRef colorSpaceRef = CGColorSpaceCreateDeviceRGB();
        CGBitmapInfo bitmapInfo = kCGBitmapByteOrderDefault;
        CGColorRenderingIntent renderingIntent = kCGRenderingIntentDefault;

        // make the cgimage
        CGImageRef imageRef = CGImageCreate(kScreenWidth, kScreenHeight, bitsPerComponent, bitsPerPixel, bytesPerRow, colorSpaceRef, bitmapInfo, provider, NULL, NO, renderingIntent);
        CGColorSpaceRelease(colorSpaceRef);
        CGDataProviderRelease(provider);

        // Redraw the drawings w/o the background
        [self erase];
        [_textures[0] drawInRect:[self bounds]];
        glDisableClientState(GL_TEXTURE_COORD_ARRAY);
        [self swapBuffers];

        // Bind the brush stroke texture to the default one
        glBindTexture(GL_TEXTURE_2D, brushTexture[0]);

        // Release the textures
        [_textures[0] release];
        [_textures[1] release];

        return imageRef;
    }


    This is my code, but I still get a darker _textures[1] with higher contrast. And when I redraw the drawings without the background, the drawings are also dark, with no additive blending either.

    Comments, please.

    Saran.
  • wassx · Posts: 13 · Registered Users
    edited September 2009
    Hi!

    I'm not familiar with the GL stuff, but I was able to take screenshots very easily with code found on this forum:
    UIGraphicsBeginImageContext(self.frame.size);
    [self.window.layer renderInContext:UIGraphicsGetCurrentContext()];
    UIImage *viewImage = UIGraphicsGetImageFromCurrentImageContext();
    UIGraphicsEndImageContext();

    UIImageWriteToSavedPhotosAlbum(viewImage, nil, nil, nil);

    My question is: how can I render just the drawings of the GLPaint example without the background, only the stroke?
    I implemented the code above in the touchesEnded method in PaintingView, and it works fine, but it only renders the background with the segmented bar.

    Any ideas please?

    Edit: I tried various layers in place of self.window.layer

    Edit 2: Sorry, I had no idea what was happening in the first post. Now I do. Thank you for this great example.
  • anthony_lynch15 · Posts: 1 · New Users
    edited November 2009
    Hey guys,

    I was having problems with this exact same issue, and this thread got me started down the road to the solution, so I thought I'd share what I learned.

    I didn't actually work it out myself; I put it together from various forum threads and a little blog post, in which a comment by a guy called gamekozo gave me the final piece of the puzzle.

    This is the piece of code that takes the contents of the drawing view / painting view / EAGLView / GLView (whatever you want to call it), renders it, turns it into a UIImage, and passes it back to wherever it came from.
    - (UIImage *) glToUIImage {
        NSInteger myDataLength = 320 * 480 * 4;

        // allocate array and read pixels into it.
        GLubyte *buffer = (GLubyte *) malloc(myDataLength);
        glReadPixels(0, 0, 320, 480, GL_RGBA, GL_UNSIGNED_BYTE, buffer);

        // gl renders "upside down" so swap top to bottom into new array.
        // there's gotta be a better way, but this works.
        GLubyte *buffer2 = (GLubyte *) malloc(myDataLength);
        for(int y = 0; y < 480; y++)
        {
            for(int x = 0; x < 320 * 4; x++)
            {
                buffer2[(479 - y) * 320 * 4 + x] = buffer[y * 4 * 320 + x];
            }
        }
        free(buffer);

        // make data provider with data.
        // note: buffer2 is leaked here; pass a release callback to
        // CGDataProviderCreateWithData to free it once the image is done with it.
        CGDataProviderRef provider = CGDataProviderCreateWithData(NULL, buffer2, myDataLength, NULL);

        // prep the ingredients
        int bitsPerComponent = 8;
        int bitsPerPixel = 32;
        int bytesPerRow = 4 * 320;
        CGColorSpaceRef colorSpaceRef = CGColorSpaceCreateDeviceRGB();

        // xxxxxx This is the line of code that I found in multiple solutions throughout
        // the web, but it doesn't deal with the transparency:
        // CGBitmapInfo bitmapInfo = kCGBitmapByteOrderDefault;
        // xxxxxx

        // ******* This is the code I used to handle the transparency!!!
        CGBitmapInfo bitmapInfo = kCGImageAlphaPremultipliedLast;
        // *******

        CGColorRenderingIntent renderingIntent = kCGRenderingIntentDefault;

        // make the cgimage
        CGImageRef imageRef = CGImageCreate(320, 480, bitsPerComponent, bitsPerPixel, bytesPerRow, colorSpaceRef, bitmapInfo, provider, NULL, NO, renderingIntent);
        CGColorSpaceRelease(colorSpaceRef);
        CGDataProviderRelease(provider);

        // then make the uiimage from that
        UIImage *myImage = [UIImage imageWithCGImage:imageRef];
        CGImageRelease(imageRef);
        return myImage;
    }
    

    The vital piece of code for being able to save the transparency/alpha properties of the GLView is the line:
    CGBitmapInfo bitmapInfo = kCGImageAlphaPremultipliedLast;
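    As an aside, "premultiplied" means each colour channel is stored already multiplied by its alpha. A one-line plain-C sketch of the conversion (the function name is mine; this matches the bytes glReadPixels returns when the framebuffer was rendered with premultiplied blending, as in the GLPaint setup discussed above):

```c
/* Convert one straight-alpha colour channel to premultiplied form
   (all values 0..255): premul = straight * alpha / 255. */
static int premultiply(int straight, int alpha)
{
    return straight * alpha / 255;
}
```

    That is why declaring the bitmap as kCGImageAlphaPremultipliedLast preserves the transparency: the byte layout finally matches what the bytes actually mean.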

    You can use code like the following to save the image created from the GLView in the code above: create a UIImage from the rendered image, then save it to the Camera Roll / Photo Album.
    UIImage *viewImage = [self glToUIImage];

    //code to write the image to the Photo Album
    UIImageWriteToSavedPhotosAlbum(viewImage, nil, nil, nil);

    You can put the image into a UIImageView over a background and then save the whole thing, etc.

    Hope that helps people and saves them a day of searching.

    ~Anthony
  • shortcipher · Posts: 6 · New Users
    edited March 2010
    Anthony, that's brilliant, works beautifully in my app!
  • bazooka · Posts: 1 · New Users
    edited August 2010
    I have two images overlapping each other (the way cards are placed on top of each other).

    Now, if I move my finger over the topmost image, that portion of the image should become transparent (the opacity of that part should become 0).

    I am new to OpenGL ES development.

    Kindly help me out or give me any suggestions for implementing this functionality.

    Thanks in advance.
  • blacky · Posts: 8 · New Users
    edited October 2010
    @NewiPhoneDeveloper

    I am working on an app that uses parts of the GLPaint code to paint over an image.

    I am new to iPhone development, and for now I am working directly on the GLPaint app to show the image.

    I do not have any UIImageView in the background. I tried your snippet, but the screen is still blank (white). I think the init method clears everything to prepare for painting, and I am not sure how to bring in the image for painting.

    Please throw some light on this.

    Thanks in advance.
  • NewiPhoneDeveloper · Posts: 459 · Registered Users
    edited October 2010
    blacky wrote: »
    @NewiPhoneDeveloper

    I am working on an app that uses parts of the GLPaint code to paint over an image.

    I am new to iPhone development, and for now I am working directly on the GLPaint app to show the image.

    I do not have any UIImageView in the background. I tried your snippet, but the screen is still blank (white). I think the init method clears everything to prepare for painting, and I am not sure how to bring in the image for painting.

    Please throw some light on this.

    Thanks in advance.

    So, you want to paint over some image, right? The easiest way to achieve this is to simply set the "opaque" value of your eaglLayer to NO. It's usually inside a method named "- (BOOL) _createSurface". Look for the following line:
    eaglLayer.opaque = YES;
    
    and set it to
    eaglLayer.opaque = NO;
    

    Now your eaglLayer should be transparent and reveal whatever view you put behind it. This way you don't actually have to put the image you want to paint over into your eaglView; instead, you can simply have it inside some UIView that you add as a subview behind your eaglView. Makes sense?

    In case I didn't get you right on this one, please post some code. Thanks.