
OpenGL ES Rendering and Simple Filter Effects

--- Update 12.25 ---

glPixelStorei

OpenGL also accepts pixel data stored with this kind of "alignment"; you simply tell it which alignment is in use via glPixelStorei, like this: int alignment = 4; glPixelStorei(GL_UNPACK_ALIGNMENT, alignment); The first argument says which pixel-storage alignment is being set, and the second argument is the value to set it to. Rows of pixels can be aligned to 1 byte (effectively no alignment), 2 bytes (an odd row length gets one padding byte), 4 bytes (a row length that is not a multiple of 4 is padded up to one), or 8 bytes, corresponding to alignment values of 1, 2, 4 and 8. The default is 4, which happens to match the row alignment used by BMP files.
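Where this matters when uploading textures: if each row of the source bitmap is tightly packed and its byte length is not a multiple of 4, the unpack alignment has to be lowered to 1 before glTexImage2D, otherwise the rows are read with padding and the image comes out skewed. A minimal sketch (the image dimensions are illustrative):

    // A tightly packed RGB image whose row length (width * 3 bytes) is not a multiple of 4
    GLsizei width  = 101;
    GLsizei height = 100;
    GLubyte *pixels = NULL;   // ... width * height * 3 bytes of RGB data ...
    
    // The default unpack alignment is 4; switch to 1 for tightly packed rows
    glPixelStorei(GL_UNPACK_ALIGNMENT, 1);
    glTexImage2D(GL_TEXTURE_2D, 0, GL_RGB, width, height, 0,
                 GL_RGB, GL_UNSIGNED_BYTE, pixels);
    
    // Optionally restore the default afterwards
    glPixelStorei(GL_UNPACK_ALIGNMENT, 4);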

--- Init Commit ---

I am also an OpenGL ES beginner; some concepts or flows in this article may be misunderstood, so corrections are welcome.

This summary is fairly rough; reading it assumes some basic knowledge of OpenGL or OpenGL ES.

First, some screenshots of the resulting effect:


I had built an OpenGL ES project before, but at the time I only followed the tutorial steps to get it working, without really sorting out the flow or the basic principles. Later I read several installments of 葉孤城's "building GPUImage from 0" series, found them very well organized, and went through the OpenGL ES material again following his line of thought.

As far as my own understanding goes, the benefit of using OpenGL is that it makes full use of the device's GPU and takes load off the CPU, while also letting you process and render images at a much lower level (video, after all, is just one image frame after another). iOS's UIKit is itself built on top of OpenGL ES (see the iOS article on view rendering and performance optimization). So it is well worth understanding how OpenGL ES basically works, what shaders are, and so on.

Apple already provides the GLKView class, a view backed by OpenGL. 《OpenGL ES應用開發實踐指南:iOS卷》 has the reader use GLKView from the very beginning and, to make the underlying usage easier to follow, builds its own AGLKContext, AGLKVertexAttribArrayBuffer and similar classes. Its main flow is:

1. Set up the GL context (EAGLContext), i.e. initialize the OpenGL API version; OpenGL ES 2 is the common choice;

    GLKView *view = (GLKView *)self.view;
    NSAssert([view isKindOfClass:[GLKView class]],
             @"View controller's view is not a GLKView");
    
    view.context = [[EAGLContext alloc]
                    initWithAPI:kEAGLRenderingAPIOpenGLES2];
    
    [EAGLContext setCurrentContext:view.context];

2. Set up GLKBaseEffect. For the kinds of effects it covers, see Apple's own description of GLKBaseEffect: "GLKBaseEffect is designed to simplify visual effects common to many OpenGL applications today. For iOS, GLKBaseEffect requires at least OpenGL ES 2.0 and for OS X, GLKBaseEffect requires at least an OpenGL Core Profile."

Its commonly used properties are listed in the GLKBaseEffect documentation (texture2d0, transform, light0, useConstantColor, constantColor and so on).
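A minimal sketch of this step, assuming a baseEffect property on the view controller (the constant color values are illustrative; the texture assignment comes from step 4 below):

    // Create the effect once, e.g. in viewDidLoad
    self.baseEffect = [[GLKBaseEffect alloc] init];
    self.baseEffect.useConstantColor = GL_TRUE;
    self.baseEffect.constantColor = GLKVector4Make(1.0f, 1.0f, 1.0f, 1.0f); // opaque white
    
    // Later, at the top of -glkView:drawInRect:, the effect has to be prepared
    // before glDrawArrays so that its internal shaders and uniforms are bound
    [self.baseEffect prepareToDraw];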


Plus some initialization such as the background (clear) color:

    GLKVector4 clearColorRGBA = GLKVector4Make(1.0f, 1.0f, 1.0f, 1.0f); // RGBA
    glClearColor(clearColorRGBA.r, clearColorRGBA.g, clearColorRGBA.b, clearColorRGBA.a);

3. Set up the vertex (attribute array) buffer; the main operations are generate, bind, buffer the data, enable, set the attribute pointer, draw, and delete;
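A minimal sketch of this step, using the SceneVertex struct and the _Buffer instance variable that the drawing code below refers to (the exact vertex values are illustrative):

    typedef struct {
        GLKVector3 positionCoords;   // x, y, z
        GLKVector2 textureCoords;    // s, t
    } SceneVertex;
    
    static const SceneVertex vertices[] = {
        {{-1.0f, -1.0f, 0.0f}, {0.0f, 0.0f}},   // bottom left
        {{ 1.0f, -1.0f, 0.0f}, {1.0f, 0.0f}},   // bottom right
        {{-1.0f,  1.0f, 0.0f}, {0.0f, 1.0f}},   // top left
        {{ 1.0f,  1.0f, 0.0f}, {1.0f, 1.0f}},   // top right
    };
    
    // Generate, bind and fill the buffer with the vertex data
    glGenBuffers(1, &_Buffer);
    glBindBuffer(GL_ARRAY_BUFFER, _Buffer);
    glBufferData(GL_ARRAY_BUFFER, sizeof(vertices), vertices, GL_STATIC_DRAW);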

4. Generate an OpenGL texture from an image (or raw image data) and render with it. With GLKit this part is fairly foolproof: assign the generated texture's name (ID) and target to the baseEffect, and the rest is handled for you;
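A minimal sketch of this step using GLKTextureLoader (the image name is hypothetical):

    NSError *error = nil;
    CGImageRef imageRef = [UIImage imageNamed:@"sample.jpg"].CGImage;   // hypothetical image name
    GLKTextureInfo *textureInfo =
        [GLKTextureLoader textureWithCGImage:imageRef
                                     options:@{GLKTextureLoaderOriginBottomLeft : @YES}
                                       error:&error];
    
    // Hand the texture to the base effect; binding and sampling are handled for us
    self.baseEffect.texture2d0.name = textureInfo.name;
    self.baseEffect.texture2d0.target = textureInfo.target;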

Inside GLKView, this method is called automatically (and periodically, when driven by a GLKViewController):

- (void)glkView:(GLKView *)view drawInRect:(CGRect)rect;

So step 5, the actual drawing, goes inside this method:

    // Clear the back frame buffer (erase previous drawing)
    glClear(GL_COLOR_BUFFER_BIT);
    
    glBindBuffer(GL_ARRAY_BUFFER, _Buffer);                  // Step 2: bind
    glEnableVertexAttribArray(GLKVertexAttribPosition);      // Step 4: enable
    // Step 5: set the attribute pointer
    glVertexAttribPointer(GLKVertexAttribPosition,            // identifies the attribute to use
                          3,                                   // number of coordinates per attribute
                          GL_FLOAT,                            // data is floating point
                          GL_FALSE,                            // no fixed-point scaling
                          sizeof(SceneVertex),                 // total bytes stored per vertex
                          (GLvoid *)offsetof(SceneVertex, positionCoords)); // offset from the start of
                                                               // each vertex to this attribute
    
    glBindBuffer(GL_ARRAY_BUFFER, _Buffer);                   // Step 2: bind
    glEnableVertexAttribArray(GLKVertexAttribTexCoord0);      // Step 4: enable
    // Step 5: set the attribute pointer
    glVertexAttribPointer(GLKVertexAttribTexCoord0,            // identifies the attribute to use
                          2,                                   // number of coordinates per attribute
                          GL_FLOAT,                            // data is floating point
                          GL_FALSE,                            // no fixed-point scaling
                          sizeof(SceneVertex),                 // total bytes stored per vertex
                          (GLvoid *)offsetof(SceneVertex, textureCoords));  // offset from the start of
                                                               // each vertex to this attribute
    
    // Step 6: draw triangles using the four vertices in the
    // currently bound vertex buffer
    glDrawArrays(GL_TRIANGLE_STRIP, 0, 4);

The other approach is the shader-based OpenGL ES usage summarized by 吳彥祖 (葉孤城). He does not use the GLK* classes, so from the start a number of things have to be set up and specified by hand:

1. Set up the OpenGL context and the layer's drawable properties, then add the layer to the view's layer

    /***  Set up the context   ***/
    _eaglContext = [[EAGLContext alloc] initWithAPI:kEAGLRenderingAPIOpenGLES2]; // OpenGL ES 2.0
    [EAGLContext setCurrentContext:_eaglContext]; // make it the current context
    
    /***  Add the layer   ***/
    _eaglLayer = [CAEAGLLayer layer];
    _eaglLayer.frame = self.view.bounds;
    _eaglLayer.backgroundColor = [UIColor yellowColor].CGColor;
    _eaglLayer.opaque = YES;
    
    _eaglLayer.drawableProperties = [NSDictionary dictionaryWithObjectsAndKeys:
                                     [NSNumber numberWithBool:NO], kEAGLDrawablePropertyRetainedBacking,
                                     kEAGLColorFormatRGBA8, kEAGLDrawablePropertyColorFormat, nil];
    [self.view.layer addSublayer:_eaglLayer];

The drawableProperties keys are documented as follows:

/************************************************************************/
/* Keys for EAGLDrawable drawableProperties dictionary                  */
/*                                                                      */
/* kEAGLDrawablePropertyRetainedBacking:                                */
/*  Type: NSNumber (boolean)                                            */
/*  Legal Values: True/False                                            */
/*  Default Value: False                                                */
/*  Description: True if EAGLDrawable contents are retained after a     */
/*               call to presentRenderbuffer.  False, if they are not   */
/*                                                                      */
/* kEAGLDrawablePropertyColorFormat:                                    */
/*  Type: NSString                                                      */
/*  Legal Values: kEAGLColorFormat*                                     */
/*  Default Value: kEAGLColorFormatRGBA8                                */
/*  Description: Format of pixels in renderbuffer                       */
/************************************************************************/


2. Set up the frame buffer and the render buffer

    /***  Delete any existing frame buffer and render buffer   ***/
    if (_renderBuffer) {
        glDeleteRenderbuffers(1, &_renderBuffer);
        _renderBuffer = 0;
    }
    
    if (_frameBuffer) {
        glDeleteFramebuffers(1, &_frameBuffer);
        _frameBuffer = 0;
    }
    
    /***  Set up the frame buffer and the render buffer   ***/
    glGenFramebuffers(1, &_frameBuffer);
    glBindFramebuffer(GL_FRAMEBUFFER, _frameBuffer);
    
    glGenRenderbuffers(1, &_renderBuffer);
    glBindRenderbuffer(GL_RENDERBUFFER, _renderBuffer);
    
    glFramebufferRenderbuffer(GL_FRAMEBUFFER, GL_COLOR_ATTACHMENT0, GL_RENDERBUFFER, _renderBuffer);
    /* Attaches an EAGLDrawable as storage for the OpenGL ES renderbuffer object bound to <target> */
    [_eaglContext renderbufferStorage:GL_RENDERBUFFER fromDrawable:_eaglLayer];
    
    GLint width = 0;
    GLint height = 0;
    glGetRenderbufferParameteriv(GL_RENDERBUFFER, GL_RENDERBUFFER_WIDTH, &width);
    glGetRenderbufferParameteriv(GL_RENDERBUFFER, GL_RENDERBUFFER_HEIGHT, &height);
    //check success
    if (glCheckFramebufferStatus(GL_FRAMEBUFFER) != GL_FRAMEBUFFER_COMPLETE) {
        NSLog(@"Failed to make complete framebuffer object: %i", glCheckFramebufferStatus(GL_FRAMEBUFFER));
    }

3. Set the viewport and the clear (background) color

    glClearColor(1.0f, 1.0f, 1.0f, 1.0f);
    glClear(GL_COLOR_BUFFER_BIT);
    glViewport(0, 0, self.view.bounds.size.width, self.view.bounds.size.height);
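One caveat about the viewport call above: self.view.bounds is in points, while the renderbuffer allocated from the CAEAGLLayer is in pixels, and on a Retina screen the two can differ by the layer's contentsScale. A safer variant reuses the width and height queried from the renderbuffer in step 2 (assuming those values are still in scope here):

    // width/height come from glGetRenderbufferParameteriv above and are in pixels
    glViewport(0, 0, width, height);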

4. Compile the shader files

- (void)setShader{
    GLuint vertexShaderName = [self compileShader:@"vertexShader.vsh" withType:GL_VERTEX_SHADER];
//    GLuint fragmenShaderName = [self compileShader:@"fragmentShader.fsh" withType:GL_FRAGMENT_SHADER];
    GLuint fragmenShaderName = [self compileShader:@"luminance.fsh" withType:GL_FRAGMENT_SHADER];
    
    _programHandle = glCreateProgram();
    glAttachShader(_programHandle, vertexShaderName);
    glAttachShader(_programHandle, fragmenShaderName);
    
    glLinkProgram(_programHandle);
    
    GLint linkSuccess;
    glGetProgramiv(_programHandle, GL_LINK_STATUS, &linkSuccess);
    if (linkSuccess == GL_FALSE) {
        GLchar messages[256];
        glGetProgramInfoLog(_programHandle, sizeof(messages), 0, &messages[0]);
        NSString *messageString = [NSString stringWithUTF8String:messages];
        NSLog(@"%@", messageString);
        exit(1);
    }
    
    _positionSlot = glGetAttribLocation(_programHandle,[@"in_Position" UTF8String]);
    _textureSlot = glGetUniformLocation(_programHandle, [@"in_Texture" UTF8String]);
    _textureCoordSlot = glGetAttribLocation(_programHandle, [@"in_TexCoord" UTF8String]);
    _colorSlot = glGetAttribLocation(_programHandle, [@"in_Color" UTF8String]);
    _Saturation_brightness = glGetAttribLocation(_programHandle, [@"in_Saturation_Brightness" UTF8String]);
    _enableGrayScale = glGetAttribLocation(_programHandle, [@"in_greyScale" UTF8String]);
    _enableNegation = glGetAttribLocation(_programHandle, [@"in_negation" UTF8String]);
    
    glUseProgram(_programHandle);
}

- (GLuint)compileShader:(NSString *)shaderName withType:(GLenum)shaderType {
    NSString *path = [[NSBundle mainBundle] pathForResource:shaderName ofType:nil];
    NSError *error = nil;
    NSString *shaderString = [NSString stringWithContentsOfFile:path encoding:NSUTF8StringEncoding error:&error];
    if (!shaderString) {
        NSLog(@"%@", error.localizedDescription);
    }
    
    const char * shaderUTF8 = [shaderString UTF8String];
    GLint shaderLength = (GLint)[shaderString length];
    GLuint shaderHandle = glCreateShader(shaderType);
    glShaderSource(shaderHandle, 1, &shaderUTF8, &shaderLength);
    glCompileShader(shaderHandle);
    
    GLint compileSuccess;
    glGetShaderiv(shaderHandle, GL_COMPILE_STATUS, &compileSuccess);
    if (compileSuccess == GL_FALSE) {
        GLchar message[256];
        glGetShaderInfoLog(shaderHandle, sizeof(message), 0, &message[0]);
        NSString *messageString = [NSString stringWithUTF8String:message];
        NSLog(@"%@", messageString);
        exit(1);
    }
    return shaderHandle;
}

This shader compilation follows 吳彥祖's approach; the versions found online are all pretty similar:

Find the shader files in the project bundle, read them in as strings, convert them to UTF8 C strings, and hand them to OpenGL to compile. Then look up the attribute and uniform locations:

    _positionSlot = glGetAttribLocation(_programHandle,[@"in_Position" UTF8String]);
    _textureSlot = glGetUniformLocation(_programHandle, [@"in_Texture" UTF8String]);
    _textureCoordSlot = glGetAttribLocation(_programHandle, [@"in_TexCoord" UTF8String]);
    _colorSlot = glGetAttribLocation(_programHandle, [@"in_Color" UTF8String]);
    _Saturation_brightness = glGetAttribLocation(_programHandle, [@"in_Saturation_Brightness" UTF8String]);
    _enableGrayScale = glGetAttribLocation(_programHandle, [@"in_greyScale" UTF8String]);
    _enableNegation = glGetAttribLocation(_programHandle, [@"in_negation" UTF8String]);

These are the shader's input parameters:

_positionSlot is the vertex position attribute

_textureCoordSlot is the corresponding texture coordinate attribute

_textureSlot is the texture being passed in (a sampler uniform)

_colorSlot is the per-vertex (fragment) color

_Saturation_brightness is a parameter I added myself, used to control color saturation and brightness

_enableGrayScale is a custom parameter, the grayscale switch (value 0 or 1)

_enableNegation is a custom parameter, the inversion switch (0 or 1; the result looks like a film negative)

5. Set up the texture

- (void)setTexture{
    glDeleteTextures(1, &texName);
    
    /***  Generate Texture   ***/
    texName = [self getTextureFromImage:[UIImage imageNamed:picName]];
    
    /***  Bind Texture   ***/
    glActiveTexture(GL_TEXTURE1);
    glBindTexture(GL_TEXTURE_2D, texName);
    glUniform1i(_textureSlot, 1);
}

- (GLuint)getTextureFromImage:(UIImage *)image {
    CGImageRef imageRef = [image CGImage];
    size_t width = CGImageGetWidth(imageRef);
    size_t height = CGImageGetHeight(imageRef);
    GLubyte* textureData = (GLubyte *)malloc(width * height * 4);
    
    CGColorSpaceRef colorSpace = CGColorSpaceCreateDeviceRGB();
    NSUInteger bytesPerPixel = 4;
    NSUInteger bytesPerRow = bytesPerPixel * width;
    NSUInteger bitsPerComponent = 8;
    CGContextRef context = CGBitmapContextCreate(textureData, width, height,
                                                 bitsPerComponent, bytesPerRow, colorSpace,
                                                 kCGImageAlphaPremultipliedLast | kCGBitmapByteOrder32Big);
    CGContextTranslateCTM(context, 0, height);
    CGContextScaleCTM(context, 1.0f, -1.0f);
    CGContextDrawImage(context, CGRectMake(0, 0, width, height), imageRef);
    
    glEnable(GL_TEXTURE_2D);
    GLuint texName;
    glGenTextures(1, &texName);
    glBindTexture(GL_TEXTURE_2D, texName);
    
    glTexParameterf(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_LINEAR);
    glTexParameterf(GL_TEXTURE_2D, GL_TEXTURE_MAG_FILTER, GL_LINEAR);
    glTexParameterf(GL_TEXTURE_2D, GL_TEXTURE_WRAP_S, GL_CLAMP_TO_EDGE);
    glTexParameterf(GL_TEXTURE_2D, GL_TEXTURE_WRAP_T, GL_CLAMP_TO_EDGE);
    
    glTexImage2D(GL_TEXTURE_2D, 0, GL_RGBA, (GLsizei)width, (GLsizei)height, 0, GL_RGBA, GL_UNSIGNED_BYTE, textureData);
    glBindTexture(GL_TEXTURE_2D, 0);
    CGContextRelease(context);
    CGColorSpaceRelease(colorSpace);
    free(textureData);
    return texName;
}
Compared with the GLKTextureLoader used in the first approach, this is essentially what GLKTextureLoader does behind the scenes. The bitmap context with the device RGB color space is created so the image can be redrawn; because of iOS's coordinate system, an image drawn directly would come out flipped (comment out the CGContextScaleCTM and CGContextTranslateCTM calls to see the resulting render).

As with the buffers, the sequence is: activate a texture unit, generate the texture, bind it, and specify its image data (textureData).

6. Draw

- (void)drawTrangle {
    UIImage *image = [UIImage imageNamed:picName];
    CGRect realRect = AVMakeRectWithAspectRatioInsideRect(image.size, self.view.bounds);
    CGFloat widthRatio = realRect.size.width/self.view.bounds.size.width;
    CGFloat heightRatio = realRect.size.height/self.view.bounds.size.height;
    
    //    const GLfloat vertices[] = {
    //        -1, -1, 0,   // bottom left
    //        1,  -1, 0,   // bottom right
    //        -1, 1,  0,   // top left
    //        1,  1,  0 }; // top right
    const GLfloat vertices[] = {
        -widthRatio, -heightRatio, 0,   // bottom left
        widthRatio,  -heightRatio, 0,   // bottom right
        -widthRatio, heightRatio,  0,   // top left
        widthRatio,  heightRatio,  0 }; // top right
    glEnableVertexAttribArray(_positionSlot);
    glVertexAttribPointer(_positionSlot, 3, GL_FLOAT, GL_FALSE, 0, vertices);
    
    // normal
    static const GLfloat coords[] = {
        0, 0,
        1, 0,
        0, 1,
        1, 1
    };
    
    glEnableVertexAttribArray(_textureCoordSlot);
    glVertexAttribPointer(_textureCoordSlot, 2, GL_FLOAT, GL_FALSE, 0, coords);
    
    static const GLfloat colors[] = {
        1, 0, 0, 1,
        0, 0, 0, 1,
        0, 0, 0, 1,
        1, 0, 0, 1
    };
    
    glEnableVertexAttribArray(_colorSlot);
    glVertexAttribPointer(_colorSlot, 4, GL_FLOAT, GL_FALSE, 0, colors);
    
    // saturation and brightness
    GLfloat saturation_brightness[] = {
        saturationPara, brightnessPara,
        saturationPara, brightnessPara,
        saturationPara, brightnessPara,
        saturationPara, brightnessPara
    };
    glEnableVertexAttribArray(_Saturation_brightness);
    glVertexAttribPointer(_Saturation_brightness, 2, GL_FLOAT, GL_FALSE, 0, saturation_brightness);
    
    // grayscale switch
    GLfloat grayScale[] = {
        grayScalePara,
        grayScalePara,
        grayScalePara,
        grayScalePara
    };
    glEnableVertexAttribArray(_enableGrayScale);
    glVertexAttribPointer(_enableGrayScale, 1, GL_FLOAT, GL_FALSE, 0, grayScale);
    
    // negation (invert)
    GLfloat negation[] = {
        negationPara,
        negationPara,
        negationPara,
        negationPara
    };
    glEnableVertexAttribArray(_enableNegation);
    glVertexAttribPointer(_enableNegation, 1, GL_FLOAT, GL_FALSE, 0, negation);
    
    glDrawArrays(GL_TRIANGLE_STRIP, 0, 4);
    [_eaglContext presentRenderbuffer:GL_RENDERBUFFER];
}
When drawing, values are passed in essentially according to the parameters declared when compiling the shaders: for each parameter, first enable the vertex attribute array, then set its pointer.

The typical OpenGL vertex coordinates for a full-screen quad would be:

    const GLfloat vertices[] = {
        -1, -1, 0,   // bottom left
         1, -1, 0,   // bottom right
        -1,  1, 0,   // top left
         1,  1, 0 }; // top right

But you cannot guarantee that the image being drawn has the same aspect ratio as the iPhone screen, so the image is usually scaled to fit the screen. The AVMakeRectWithAspectRatioInsideRect function returns the largest rect the image can occupy inside the screen while keeping its aspect ratio; its documentation:

/*!
 @function					AVMakeRectWithAspectRatioInsideRect
 @abstract					Returns a scaled CGRect that maintains the aspect ratio specified by a CGSize within a bounding CGRect.
 @discussion				This is useful when attempting to fit the presentationSize property of an AVPlayerItem within the bounds of another CALayer. 
							You would typically use the return value of this function as an AVPlayerLayer frame property value. For example:
							myPlayerLayer.frame = AVMakeRectWithAspectRatioInsideRect(myPlayerItem.presentationSize, mySuperLayer.bounds);
 @param aspectRatio			The width & height ratio, or aspect, you wish to maintain.
 @param	boundingRect		The bounding CGRect you wish to fit into. 
 */
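A quick worked example of how the width/height ratios used in drawTrangle come out (the numbers are illustrative; requires AVFoundation):

    // A 4:3 image fitted into a 375 x 667 point view
    CGRect bounds = CGRectMake(0, 0, 375, 667);
    CGRect fitted = AVMakeRectWithAspectRatioInsideRect(CGSizeMake(4, 3), bounds);
    // fitted == { {0, 192.875}, {375, 281.25} } -- full width, letterboxed vertically
    
    CGFloat widthRatio  = fitted.size.width  / bounds.size.width;    // 1.0
    CGFloat heightRatio = fitted.size.height / bounds.size.height;   // ~0.42
    // These ratios shrink the unit quad's x/y so the rendered quad matches the fitted rect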

About glDrawArrays(GLenum mode, GLint first, GLsizei count): there are several draw modes. Fundamentally, OpenGL draws triangles, and the triangle is the basic drawing primitive. Even a 3D model with an apparently smooth spherical surface is built from countless triangles, subdivided until the surface approaches smoothness.

So triangles are normally drawn from the vertices in groups of three, and the iPhone's rectangular screen can be covered with just two triangles. For the specific draw modes, see OpenGL基本圖元轉換為GL_TRIANGLES.
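A brief sketch of what the glDrawArrays call in drawTrangle produces with the four vertices above (bottom left, bottom right, top left, top right):

    // GL_TRIANGLE_STRIP reuses the previous two vertices for each new triangle,
    // so 4 vertices yield the 2 triangles that cover the quad:
    //   triangle 1: vertices {0, 1, 2}   (bottom left, bottom right, top left)
    //   triangle 2: vertices {1, 2, 3}   (bottom right, top left, top right)
    glDrawArrays(GL_TRIANGLE_STRIP, 0, 4);
    
    // With GL_TRIANGLES the same quad would need 6 vertices,
    // because each triangle has to be listed in full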

About shader programming

Shader programming feels rather unforgiving: it is very strict about variables and data types, and everything has to match up exactly, so be extra careful when writing shaders.

The demo also processes frames captured from the camera in real time and renders them to a view with OpenGL. The overall flow is much the same as approaches 1 and 2; three points deserve attention:

1. The images captured by the camera come out rotated 90° counterclockwise by default, while AVCaptureVideoPreviewLayer shows them the right way up, presumably because Apple corrects this internally. If you render them with OpenGL yourself, you need to rotate the image 90° clockwise. One way is to adjust the texture coordinates in the shader; see the demo;

2. The capture output has a videoSettings property that sets the format of the output video frames. As mentioned earlier, OpenGL ES's glTexImage2D accepts GL_RGBA but not GL_BGRA (at least not without an extension), while Apple's video output can be configured for kCVPixelFormatType_32BGRA or kCVPixelFormatType_32RGBA.

Don't celebrate too soon, though: kCVPixelFormatType_32RGBA is not supported on some devices, so the shader has to handle the BGRA-to-RGBA conversion itself; see the demo;
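A typical configuration for point 2, assuming an AVCaptureVideoDataOutput instance named output (the key and pixel format constants come from AVFoundation / Core Video):

    AVCaptureVideoDataOutput *output = [[AVCaptureVideoDataOutput alloc] init];
    // Ask the camera for BGRA frames, which every iOS device supports
    output.videoSettings = @{ (id)kCVPixelBufferPixelFormatTypeKey :
                                  @(kCVPixelFormatType_32BGRA) };
    // The fragment shader then swizzles BGRA back to RGBA when sampling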

3. Pay close attention to memory when using OpenGL: delete render buffers and textures you no longer need, otherwise memory usage will balloon;

On brightness, saturation and grayscale, which get into image processing

A grayscale image is produced by taking each pixel's RGB values, computing a weighted sum of them, and assigning that result back to R, G and B;

The simplest brightness adjustment adds the same brightness value to each of the R, G and B channels; each channel ranges from 0 to 1, and anything above 1 is clamped to 1;

For saturation, a grayscale color greyScaleColor is first generated from the pixel's RGB; the final color is then adjusted according to the user-supplied saturation value, because mix(greyScaleColor, textureColor.rgb, saturation) = greyScaleColor * (1 - saturation) + textureColor.rgb * saturation

Inversion takes each pixel's RGB values and replaces each channel with its complement (R' = 1 - R, and likewise for G and B), which produces an effect similar to a film negative
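A CPU-side sketch of the per-pixel math described above (the demo does all of this in the fragment shader; the grayscale weights below are the common Rec. 601 luminance weights and the ordering of the operations is illustrative, so both may differ from the demo's shader):

static GLfloat clamp01(GLfloat v) { return v < 0.0f ? 0.0f : (v > 1.0f ? 1.0f : v); }

typedef struct { GLfloat r, g, b; } RGBColor;   // one pixel, each channel in 0..1

static RGBColor applyFilters(RGBColor c, GLfloat brightness, GLfloat saturation,
                             BOOL grayScale, BOOL negation) {
    // Grayscale: a weighted sum of R, G and B written back to every channel
    GLfloat grey = 0.299f * c.r + 0.587f * c.g + 0.114f * c.b;
    if (grayScale) { c.r = c.g = c.b = grey; }
    
    // Brightness: add the same offset to each channel, clamping to the 0..1 range
    c.r = clamp01(c.r + brightness);
    c.g = clamp01(c.g + brightness);
    c.b = clamp01(c.b + brightness);
    
    // Saturation: mix(grey, color, s) = grey * (1 - s) + color * s
    c.r = grey * (1.0f - saturation) + c.r * saturation;
    c.g = grey * (1.0f - saturation) + c.g * saturation;
    c.b = grey * (1.0f - saturation) + c.b * saturation;
    
    // Negation: each channel becomes its complement, like a film negative
    if (negation) { c.r = 1.0f - c.r; c.g = 1.0f - c.g; c.b = 1.0f - c.b; }
    return c;
}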