So, what’s a QR code? I believe most of you already know. In case you haven’t heard of it, just take a look at the image below. That’s a QR code.
QR (short for Quick Response) code is a kind of 2-dimensional barcode developed by Denso. Originally designed for tracking parts in manufacturing, QR codes have gained popularity in the consumer space in recent years as a way to encode the URL of a landing page or marketing information. Unlike the basic barcodes you’re familiar with, a QR code contains information in both the horizontal and vertical directions, which is why it can store a larger amount of data, in both numeric and letter form. I don’t want to go into the technical details of QR codes here. If you’re interested, you can check out the official website of QR code to learn more.
In recent years, the use of QR codes has been on the rise. They appear in magazines, newspapers, advertisements, billboards and even name cards. As an iOS developer, you may wonder how to empower your app to read QR codes. Some time ago, Gabriel wrote a great tutorial on QR codes. In this tutorial, we will build a similar QR code reader app, but in Swift. After going through the tutorial, you will understand how to use the AVFoundation framework to discover and read QR codes in real time.
Let’s get started.
Demo App
The demo app that we’re going to build is fairly simple and straightforward. Before we get to the demo app, it’s important to understand that any barcode scanning in iOS, including QR code scanning, is totally based on video capture. That’s why the barcode scanning feature is part of the AVFoundation framework. Keep this point in mind, as it’ll help you understand the rest of the tutorial.
So, how does the demo app work?
Take a look at the screenshot below. This is how the app UI looks. The app works pretty much like a video capturing app, but without the recording feature. When the app is launched, it uses the iPhone’s rear camera to spot a QR code and recognizes it automatically. The decoded information (e.g. a URL) is displayed right at the bottom of the screen.
To build the app, you can start by downloading the project template from here. I have pre-built the storyboard and linked up the message label for you.
Using AVFoundation Framework
I have created the user interface of the app in the project template. The label in the UI will be used to display the decoded information of the QR code, and it is connected to the messageLabel property of the ViewController class.
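For reference, the outlet in the template is declared along these lines (a minimal sketch; check ViewController.swift in the template for the exact declaration):

// Outlet connected to the label at the bottom of the storyboard scene.
// (Assumed declaration – the actual name comes from the project template.)
@IBOutlet weak var messageLabel: UILabel!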
As mentioned, we rely on the AVFoundation framework to implement the QR code scanning feature. First, open ViewController.swift and import the framework:
import AVFoundation
Later, we’ll need to implement the AVCaptureMetadataOutputObjectsDelegate protocol. We’ll talk about that in a while. For now, update the class declaration like this:
class ViewController: UIViewController, AVCaptureMetadataOutputObjectsDelegate
Before moving on, declare the following variables in the ViewController class. We’ll talk about them one by one later.
var captureSession:AVCaptureSession?
var videoPreviewLayer:AVCaptureVideoPreviewLayer?
var qrCodeFrameView:UIView?
Implementing video capture
As mentioned in the earlier section, QR code reading is totally based on video capture. To perform a real-time capture, all we need to do is instantiate an AVCaptureSession object with the input set to the appropriate AVCaptureDevice for video capturing. Insert the following code in the viewDidLoad method of the ViewController class:
// Get an instance of the AVCaptureDevice class to initialize a device object and provide the video
// as the media type parameter.
let captureDevice = AVCaptureDevice.defaultDeviceWithMediaType(AVMediaTypeVideo)

// Get an instance of the AVCaptureDeviceInput class using the previous device object.
var error:NSError?
let input: AnyObject! = AVCaptureDeviceInput.deviceInputWithDevice(captureDevice, error: &error)

if (error != nil) {
    // If any error occurs, simply log the description of it and don't continue any more.
    println("\(error?.localizedDescription)")
    return
}

// Initialize the captureSession object.
captureSession = AVCaptureSession()

// Set the input device on the capture session.
captureSession?.addInput(input as AVCaptureInput)
An AVCaptureDevice object represents a physical capture device. You use a capture device to configure the properties of the underlying hardware. As we are going to capture video data, we call the defaultDeviceWithMediaType method with AVMediaTypeVideo to get the video capture device. To perform a real-time capture, we instantiate an AVCaptureSession object and add the input of the video capture device. The AVCaptureSession object is used to coordinate the flow of data from the video input device to outputs.
In this case, the output of the session is set to an AVCaptureMetadataOutput object. The AVCaptureMetadataOutput class is the core part of QR code reading. This class, in combination with the AVCaptureMetadataOutputObjectsDelegate protocol, is used to intercept any metadata found in the input device (that is, the QR code captured by the device’s camera) and translate it into a human-readable format. Don’t worry if something sounds strange or if you don’t totally understand it right now. Everything will become clear in a while. For now, continue to add the following lines of code in the viewDidLoad method:
// Initialize a AVCaptureMetadataOutput object and set it as the output device to the capture session.
let captureMetadataOutput = AVCaptureMetadataOutput()
captureSession?.addOutput(captureMetadataOutput)
Next, proceed to add the lines of code below. We set self as the delegate of the captureMetadataOutput object. This is the reason why the ViewController class adopts the AVCaptureMetadataOutputObjectsDelegate protocol. When new metadata objects are captured, they are forwarded to the delegate object for further processing. Here we also need to specify the dispatch queue on which to execute the delegate’s methods. According to Apple’s documentation, the queue must be a serial queue. So we simply use the dispatch_get_main_queue() function to get the main queue, which is serial.
// Set delegate and use the default dispatch queue to execute the call back
captureMetadataOutput.setMetadataObjectsDelegate(self, queue: dispatch_get_main_queue())
captureMetadataOutput.metadataObjectTypes = [AVMetadataObjectTypeQRCode]
The metadataObjectTypes property is also quite important, as this is where we tell the app what kind of metadata we are interested in. AVMetadataObjectTypeQRCode clearly indicates our purpose.
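As a side note, the types you assign here must be among those the output actually supports. If you’re curious which types are available, you could log them (an optional sketch; note that availableMetadataObjectTypes is only populated after the output has been added to the session):

// Optional: print the metadata types this output can detect on the current device.
println(captureMetadataOutput.availableMetadataObjectTypes)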
Now that we have set up and configured an AVCaptureMetadataOutput object, we need to display the video captured by the device’s camera on screen. This can be done using an AVCaptureVideoPreviewLayer, which is actually a CALayer. You use this preview layer in conjunction with a capture session to display video, and the preview layer is added as a sublayer of the current view’s layer. Here is the code:
// Initialize the video preview layer and add it as a sublayer to the view's layer.
videoPreviewLayer = AVCaptureVideoPreviewLayer(session: captureSession)
videoPreviewLayer?.videoGravity = AVLayerVideoGravityResizeAspectFill
videoPreviewLayer?.frame = view.layer.bounds
view.layer.addSublayer(videoPreviewLayer)
Finally, we start the video capture by calling the startRunning method of the capture session:
// Start video capture.
captureSession?.startRunning()
If you compile and run the app, it should start capturing video as soon as it launches. But wait, the message label seems to be hidden; it is now covered by the video preview layer.
You can fix it by adding the following line of code:
// Move the message label to the top view
view.bringSubviewToFront(messageLabel)
Re-run the app after making the change. The message label “No QR code is detected” should now appear on screen.
Implementing QR Code Reading
As of now, the app looks pretty much like a video capture app. So how can it scan a QR code and translate the code into something meaningful? The app itself is already capable of detecting QR codes; we are just not aware of it yet. Here is how we are going to tweak the app:
1. When a QR code is detected, the app will highlight the code using a green box.
2. The QR code will be decoded and the decoded information will be displayed at the bottom of the screen.
Initializing the Green Box
In order to highlight the QR code, we’ll first create a UIView object and set its border to green. Add the following code in the viewDidLoad method:
// Initialize QR Code Frame to highlight the QR code
qrCodeFrameView = UIView()
qrCodeFrameView?.layer.borderColor = UIColor.greenColor().CGColor
qrCodeFrameView?.layer.borderWidth = 2
view.addSubview(qrCodeFrameView!)
view.bringSubviewToFront(qrCodeFrameView!)
The UIView is invisible on screen as the size of the qrCodeFrameView object is set to zero by default. Later, when a QR code is detected, we will change its size and turn it into a green box.
Decoding the QR Code
As mentioned earlier, when the AVCaptureMetadataOutput object recognizes a QR code, the following delegate method of AVCaptureMetadataOutputObjectsDelegate will be called:
optional func captureOutput(_ captureOutput: AVCaptureOutput!,
    didOutputMetadataObjects metadataObjects: [AnyObject]!,
    fromConnection connection: AVCaptureConnection!)
So far we haven’t implemented the method and this is why the app can’t translate the QR code. In order to capture the QR code and decode the information, we need to implement the method to perform additional processing on metadata objects. Here is the code:
func captureOutput(captureOutput: AVCaptureOutput!, didOutputMetadataObjects metadataObjects: [AnyObject]!, fromConnection connection: AVCaptureConnection!) {

    // Check if the metadataObjects array is not nil and it contains at least one object.
    if metadataObjects == nil || metadataObjects.count == 0 {
        qrCodeFrameView?.frame = CGRectZero
        messageLabel.text = "No QR code is detected"
        return
    }

    // Get the metadata object.
    let metadataObj = metadataObjects[0] as AVMetadataMachineReadableCodeObject

    if metadataObj.type == AVMetadataObjectTypeQRCode {
        // If the found metadata is equal to the QR code metadata then update the status label's text and set the bounds
        let barCodeObject = videoPreviewLayer?.transformedMetadataObjectForMetadataObject(metadataObj as AVMetadataMachineReadableCodeObject) as AVMetadataMachineReadableCodeObject
        qrCodeFrameView?.frame = barCodeObject.bounds

        if metadataObj.stringValue != nil {
            messageLabel.text = metadataObj.stringValue
        }
    }
}
The second parameter of the method (i.e. metadataObjects) is an array containing all the metadata objects that have been read. The very first thing we do is make sure the array is not nil and contains at least one object. Otherwise, we reset the size of qrCodeFrameView to zero and set messageLabel to its default message.
If a metadata object is found, we check to see if it is a QR code. If that’s the case, we’ll proceed to find out the bounds of the QR code.
This couple of lines of code sets up the green box for highlighting the QR code. By calling the transformedMetadataObjectForMetadataObject method of videoPreviewLayer, the metadata object’s visual properties are converted into layer coordinates. From that, we can find out the bounds of the QR code for constructing the green box.
let barCodeObject = videoPreviewLayer?.transformedMetadataObjectForMetadataObject(metadataObj as AVMetadataMachineReadableCodeObject) as AVMetadataMachineReadableCodeObject
qrCodeFrameView?.frame = barCodeObject.bounds
Lastly, we decode the QR code into human-readable information. It’s fairly simple: the decoded information can be accessed through the stringValue property of an AVMetadataMachineReadableCodeObject.
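Since the decoded string is often a URL, you could optionally take it a step further and open it in Safari. This is not part of the demo app, just a hedged sketch you might drop inside the delegate method after updating the label:

// Hypothetical extension: if the decoded string is a valid URL, open it in Safari.
if let urlString = metadataObj.stringValue {
    if let url = NSURL(string: urlString) {
        UIApplication.sharedApplication().openURL(url)
    }
}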
Now you’re ready to go! Hit the Run button to compile and run the app on a real device. Once it launches, point the camera at a QR code like the one shown at the beginning of this tutorial. The app immediately detects the code and decodes the information.
Summary
With the AVFoundation framework, creating a QR code reader app has never been easier. Besides QR codes, the AVFoundation framework supports various other barcode formats such as Code 39, Code 128, Aztec and PDF417. Try to tweak the existing Xcode project and enable the demo to scan other types of barcodes.
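For example, one possible tweak (an untested sketch, using constants from the same AVFoundation framework) is to extend the metadataObjectTypes assignment in viewDidLoad:

// Accept a few more barcode formats in addition to QR codes.
captureMetadataOutput.metadataObjectTypes = [AVMetadataObjectTypeQRCode,
                                             AVMetadataObjectTypeCode39Code,
                                             AVMetadataObjectTypeCode128Code,
                                             AVMetadataObjectTypeAztecCode,
                                             AVMetadataObjectTypePDF417Code]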
For your reference, you can download the complete Xcode project from here.
If you like this tutorial, you’ll probably like our new Swift programming book. Check it out to learn more.