Preface:
In fact, this goes beyond watermarks; it also covers ideas such as 3D photo albums, all of which can be implemented with GPUImage.
Imagine recording video on your phone, then adding your own effects and turning it into a dynamic photo album. Exciting, isn't it?
The effect:

Explanation:
This implements a simple animation. The logic: view A follows view B as B rotates, while A's frame stays inset within B's frame and changes along with it.
It is actually quite simple:
// The inset, angle, and axis constants below are illustrative assumptions
imageView1.frame = CGRectMake(imageView2.frame.origin.x + 10,
                              imageView2.frame.origin.y + 10,
                              imageView2.frame.size.width - 20,
                              imageView2.frame.size.height - 20);
imageView2.layer.transform = CATransform3DRotate(imageView2.layer.transform, M_PI / 60, 0, 1, 0);
A simple CATransform3D animation.
For video watermarking itself, see the previous post:
http://blog.csdn.net/xoxo_x/article/details/71055867
An introduction to GPUImageUIElement
Its initializers and instance methods:
// Initialization and teardown
- (id)initWithView:(UIView *)inputView;
- (id)initWithLayer:(CALayer *)inputLayer;
// Layer management
- (CGSize)layerSizeInPixels;
- (void)update;
- (void)updateUsingCurrentTime;
- (void)updateWithTimestamp:(CMTime)frameTime;
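As a minimal sketch of how these pieces fit together (a hedged example; videoCamera and filterView are assumed to be configured as in the full listing below):
// Wrap an ordinary UIKit hierarchy so GPUImage can render it as a texture
UIView *overlay = [[UIView alloc] initWithFrame:self.view.bounds];
GPUImageUIElement *element = [[GPUImageUIElement alloc] initWithView:overlay];
// Blend the camera feed (first input) with the rendered view (second input)
GPUImageAlphaBlendFilter *blend = [[GPUImageAlphaBlendFilter alloc] init];
blend.mix = 1.0; // show the overlay at full strength
[videoCamera addTarget:blend];
[element addTarget:blend];
[blend addTarget:filterView];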
Note:
We want the watermark to animate, so we need the timestamp of each video frame; that is why the updateWithTimestamp: method is the one to use.
If we did not use it, we could drive updates from a timer instead, but obtaining the timestamp for the current moment that way is awkward; a sketch of that fallback follows.
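A hedged sketch of the timer-driven fallback (the display link and the refreshOverlay: method are illustrative names, not from the original code):
// In viewDidLoad: redraw the overlay on every screen refresh instead of per video frame
CADisplayLink *link = [CADisplayLink displayLinkWithTarget:self
                                                  selector:@selector(refreshOverlay:)];
[link addToRunLoop:[NSRunLoop mainRunLoop] forMode:NSRunLoopCommonModes];

// Elsewhere in the class:
- (void)refreshOverlay:(CADisplayLink *)link {
    [pictureView update]; // no media timestamp; updateUsingCurrentTime is the wall-clock variant
}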
Monitoring video time:
setFrameProcessingCompletionBlock is a per-frame callback for tracking processing progress, and it hands us each frame's timestamp.
GPUImageFilter *progressFilter = [[GPUImageFilter alloc] init];
[videoCamera addTarget:progressFilter];
[progressFilter addTarget:filter];
// This gives us the timestamp of the frame currently being processed
__weak typeof(self) weakSelf = self;
[progressFilter setFrameProcessingCompletionBlock:^(GPUImageOutput *output, CMTime time) {
    __strong typeof(weakSelf) strongSelf = weakSelf;
    if (!strongSelf) return;
    // Inset, angle, and axis values are illustrative assumptions
    imageView1.frame = CGRectMake(imageView2.frame.origin.x + 10,
                                  imageView2.frame.origin.y + 10,
                                  imageView2.frame.size.width - 20,
                                  imageView2.frame.size.height - 20);
    imageView2.layer.transform = CATransform3DRotate(imageView2.layer.transform, M_PI / 60, 0, 1, 0);
    [strongSelf->pictureView updateWithTimestamp:time];
}];
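For reference, the render chain assembled in the full listing is:
videoCamera --> progressFilter --> filter (GPUImageAlphaBlendFilter) --> filterView
pictureView (GPUImageUIElement) --> filter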
Full code:
//
//  ViewController.m
//  WatermarkDemo
//
//  Created by 馮士魁
//  Copyright © xoxo_x. All rights reserved.
//
/**
 *  You will find more fun things here:
 *  http://blog.csdn.net/xoxo_x/article
 */
#import "ViewController.h"
#import "GPUImage.h"

@interface ViewController () {
    // GPUImagePicture *pictureFile;
    GPUImageOutput<GPUImageInput> *filter;
    GPUImageVideoCamera *videoCamera;
    GPUImageView *filterView;
    GPUImageUIElement *pictureView;
    GPUImageMovie *movieFile;
}
@end

@implementation ViewController

- (void)viewDidLoad {
    [super viewDidLoad];
    [self initGPUImageView];
    [self initFilter];
    [self initCamera];

    // Build the UIKit hierarchy that will be rendered as the watermark.
    // The frames below are assumptions: each image view takes a quarter of the screen.
    UIView *view = [[UIView alloc] initWithFrame:self.view.bounds];
    UIImageView *imageView1 = [[UIImageView alloc] initWithFrame:CGRectMake(0, 0, self.view.bounds.size.width / 2, self.view.bounds.size.height / 2)];
    imageView1.image = [UIImage imageNamed:@"美女1.jpg"];
    [view addSubview:imageView1];
    UIImageView *imageView2 = [[UIImageView alloc] initWithFrame:CGRectMake(self.view.bounds.size.width / 2, self.view.bounds.size.height / 2, self.view.bounds.size.width / 2, self.view.bounds.size.height / 2)];
    imageView2.image = [UIImage imageNamed:@"美女2.jpg"];
    [view addSubview:imageView2];

    pictureView = [[GPUImageUIElement alloc] initWithView:view];
    GPUImageFilter *progressFilter = [[GPUImageFilter alloc] init];
    [videoCamera addTarget:progressFilter];
    [progressFilter addTarget:filter];
    [pictureView addTarget:filter];
    [filter addTarget:filterView];
    [videoCamera startCameraCapture];

    // Weak reference to avoid a retain cycle through the completion block
    __weak typeof(self) weakSelf = self;
    [progressFilter setFrameProcessingCompletionBlock:^(GPUImageOutput *output, CMTime time) {
        __strong typeof(weakSelf) strongSelf = weakSelf;
        if (!strongSelf) return;
        // Inset, angle, and axis values are illustrative assumptions
        imageView1.frame = CGRectMake(imageView2.frame.origin.x + 10,
                                      imageView2.frame.origin.y + 10,
                                      imageView2.frame.size.width - 20,
                                      imageView2.frame.size.height - 20);
        imageView2.layer.transform = CATransform3DRotate(imageView2.layer.transform, M_PI / 60, 0, 1, 0);
        [strongSelf->pictureView updateWithTimestamp:time];
    }];
}

- (void)initGPUImageView {
    filterView = [[GPUImageView alloc] initWithFrame:self.view.frame];
    [self.view addSubview:filterView];
}

- (void)initFilter {
    filter = [[GPUImageAlphaBlendFilter alloc] init];
}

- (void)initCamera {
    videoCamera = [[GPUImageVideoCamera alloc] initWithSessionPreset:AVCaptureSessionPreset640x480 cameraPosition:AVCaptureDevicePositionBack];
    videoCamera.outputImageOrientation = UIInterfaceOrientationPortrait;
    videoCamera.horizontallyMirrorFrontFacingCamera = YES;
}

@end
Tips are welcome. Tip me and then add me as a friend O(∩_∩)O haha~