ART-UP: A Novel Method for Generating Scanning-robust Aesthetic QR codes

Mingliang Xu, Qingfeng Li, Jianwei Niu, Xiting Liu, Weiwei Xu, Pei Lv, and Bing Zhou
Abstract

QR codes are usually scanned in diverse environments, so they must be robust to variations in illumination, scale, coverage, and camera angle. Aesthetic QR codes improve visual quality, but subtle changes in their appearance may cause scanning failure. In this paper, a new method for generating scanning-robust aesthetic QR codes is proposed, based on a module-based scanning probability estimation model that effectively balances the tradeoff between visual quality and scanning robustness. Our method locally adjusts the luminance of each module by estimating the probability of successful sampling. The approach adopts a hierarchical, coarse-to-fine strategy to enhance the visual quality of aesthetic QR codes, sequentially generating three codes: a binary aesthetic QR code, a grayscale aesthetic QR code, and the final color aesthetic QR code. Our approach can also be used to create QR codes with different visual styles by adjusting a few initialization parameters. User surveys and decoding experiments were used to evaluate our method against state-of-the-art algorithms, and the results indicate that the proposed approach performs excellently in terms of both visual quality and scanning robustness.

Index Terms—Aesthetic QR codes, error analysis, visualization optimization, scanning robustness, scanning probability calculation.

1 Introduction

A quick response (QR) code is a matrix symbology consisting of an array of nominally square modules. The popularization of smartphones has brought about the wide use of QR codes in both offline and online life. These codes offer the advantages of large information capacity, low cost, and easy manufacture [1]. At the same time, their appearance, consisting of black and white square modules, makes it difficult to meet the individualized demands of customers because of the lack of aesthetic elements.

Embellishing QR codes aims to improve the visual quality of their appearance, making them more interesting and appealing [2, 3, 4, 5, 6, 7]. Embellished codes can incorporate high-level semantic features such as faces, letters, or logos into otherwise plain designs and contribute to brand promotion. However, changing the appearance of QR codes manually is costly and difficult, as designers have to repeatedly confirm that the result can be recognized by general QR scanners. Consequently, an automatic approach is desirable for efficiency and low additional design cost.

The main challenge in embellishing QR codes is to ensure that the original information is not affected and can still be decoded correctly by general scanners even when the code’s appearance has been changed. Scanning robustness can be defined as the property describing whether an embellished QR code can be correctly scanned by general-purpose scanners. When QR codes are scanned in real-life applications, good scanning robustness is especially important, as it reduces the impact of various environmental factors, such as illumination, noise, coverage, and camera angle, during the scanning process.

Traditional QR code images express information via modules in two highly contrasting colors, and the Reed-Solomon algorithm [8] is used to enhance fault tolerance and maintain scanning robustness [1]. Embellished aesthetic QR codes, on the other hand, enhance visual quality and convey visual information by introducing other colors, changing module shapes, embedding icons, or adjusting codewords. Some representative aesthetic QR codes are shown in Fig. 1.

Fig. 1: Various types of aesthetic QR codes

In this work, we propose ART-UP codes - scAnning-Robust aesThetic QR (qUick resPonse) codes. We establish a new module-based scanning probability estimation model to measure the scanning robustness of aesthetic QR codes, resulting in QR codes with the same error tolerance but improved embellishment. More specifically, we first analyze the steps and fundamental reasons that cause errors during QR code scanning in accordance with QR decoding principles. Scanning errors are classified as either thresholding errors or sampling errors. Then, error models are accordingly established, with corresponding estimations of the probability of correctly sampling each module.

We then use an iterative luminance adjustment solution to locally adjust the luminance of each module by combining these probability estimation models, so as to increase the probability of correctly sampling and to improve scanning robustness. In addition, in order to obtain better visual quality and make the modification more effective, we combine the image saliency with the corresponding probability constraint, which generates QR codes that are scanning-robust and well embellished.

The major contributions of this work are as follows:

  • The scanning and decoding process of QR codes is analyzed, the thresholding and sampling errors that affect the robustness of embellished QR codes are quantified, and a module-based scanning probability estimation model is built to estimate the scanning robustness of a QR code.

  • An optimization strategy based on a local luminance adjustment algorithm is proposed, which balances the tradeoff between visual quality and scanning robustness. On top of it, a new method for generating aesthetic QR codes is established, which achieves state-of-the-art visual quality while preserving error correction ability.

  • An iterative threshold estimation algorithm is proposed, which is flexible enough to generate aesthetic QR code images with different styles by adjusting initialization parameters.

  • An easy-to-implement and effective luminance adjustment algorithm based on linear interpolation is proposed, which is both fast and accurate. This algorithm converts a grayscale aesthetic QR code into a color one.

2 Related Work

QR codes were originally invented for tracking vehicles and parts during manufacturing. Recently, with the rapid development of the mobile Internet, they have been applied to many different fields, which has led to increased research interest in techniques for embellishing QR codes [3, 2, 4, 9, 10, 5, 11, 12, 13, 14, 15, 16].

The existing foundational algorithms for generating embellished QR codes can be classified into four groups: embedding icons [17], replacing colors [18], changing the module shape [3, 14], and adjusting codewords [2]. Among these, embedding icons is the easiest to implement, but it relies on the inherent error correction capacity of the QR code, so the controllable area is relatively small and monotonous, and the error correction ability is reduced [17, 11]. Replacing colors and changing the module shape to attach semantic information often require manual intervention and are difficult to handle [14]. Generating QR codes by adjusting codewords usually achieves scanning robustness, but the resulting visual quality is poor and often inconsistent [2].

In recent studies, researchers usually try to blend QR code modules with an arbitrary input image in order to obtain a general method. For example, Cox [2] looked into the principle of QR code encoding and, by combining the characteristics of Reed-Solomon codes with the Gauss-Jordan elimination algorithm, proposed a method that adjusts the codewords of a QR code during encoding, making the resulting code similar to a target binary image. However, this method is suitable for encoding URL data only. Through constant optimization, it has become the most efficient codeword-adjustment algorithm.

Later, in 2013, Chu et al. [3] proposed Halftone QR codes, combining a halftoning algorithm with embellished QR codes. By separating each module into 3-by-3 sub-modules and binding the module color to the central sub-module, a sub-module replacement algorithm was proposed, which offered reliability and regularization in generating halftone QR codes. However, the algorithm relies on a nonlinear optimization procedure that suffers from inefficiency and low visual quality.

Subsequently, Garateguy et al. [9] proposed a new halftone-based algorithm built on the selection of a set of modified pixels using a halftoning mask and a probabilistic model that predicts the distortion generated by the embedded image. Although it enhances visual quality by sacrificing some scanning robustness, the generated images still contain considerable noise.

To further improve the visual quality of aesthetic QR codes, Yu-Hsun et al. [4] and Shih-Syun et al. [7] proposed two new algorithms. Yu-Hsun et al. took the saliency of the embedded image into consideration so that pixels are modified more effectively, which offers a remarkable improvement in the visual quality of high-version QR codes, but the improvement for low-version codes is limited. Shih-Syun et al. proposed an easy and efficient method in which the QR code module sequence is first adjusted to maintain its global similarity to the input image via Gauss-Jordan elimination, and a rendering mechanism then blends the input image into the QR code according to a weight map for better visual quality. Although the visual effect is quite good, it severely degrades scanning robustness.

Apart from constantly improving the visual effect of QR codes, maintaining scanning robustness is also a challenge that researchers must take into consideration. Alva et al. [19] proposed a commercial algorithm (Visualead) for generating scanning-robust embellished QR codes, but its implementation is closed and its results often appear cluttered. Zhang et al. [5] proposed an aesthetic QR code generation algorithm based on two-stage image blending, namely module-based and pixel-based blending. This resulted in a relatively robust generation algorithm; however, it relies on empirical rather than theoretical analysis, so some images fail to scan correctly for unexplained reasons, making the algorithm unsuitable for general use.

In this paper, we theoretically analyze and guarantee the scanning robustness of QR codes by establishing an error analysis model that estimates the probability of a single module being scanned correctly. Meanwhile, by designing an adaptive parametric method, visually appealing and highly robust QR codes are obtained.

3 QR Code Scanning

In order to generate an embellished QR code that is scanning-robust, we must analyze the details of the whole scanning and decoding process and determine the error-generating factors that may affect scanning. These are discussed in this section.

3.1 Decoding Algorithm

A QR code is a matrix that consists of white and black modules ordered by a specified encoding rule, where each module’s white or black color represents a bit, i.e. a 0 or 1, and 8 bits make up a codeword. Further, the codeword sequence is always subdivided into one or more blocks, and each block is divided into two parts: data codewords and error correction codewords. More information about the generation of QR codes can be found in [1].

Typically, the scanning process involves the use of terminal devices to obtain the QR code image from screens or prints via a camera and then using image processing technology to locate and sample the black-and-white modules. According to the sampled information, the QR code image is transformed into a matrix, which is subsequently decoded using a decoding algorithm[20, 21, 22, 23].

In accordance with the most well-known QR code generating and scanning approach, based on the open-source library ZXing [20], we divide the scanning process into three steps: preprocessing, detection and recognition. Preprocessing includes transforming the RGB image into a grayscale one, and then converting the 256-gray intensity level image into a binary one. Detection involves finding and confirming the exact location of QR code, while recognition consists of down-sampling and decoding, as shown in Fig. 2.

3.1.1 Preprocessing

Images captured by the camera are transformed into binary images during the preprocessing step, which mainly consists of two sub-steps. First, the RGB image is converted into a single-channel grayscale image using the equation below:

Y(x, y) = 0.299 R(x, y) + 0.587 G(x, y) + 0.114 B(x, y)    (1)

where Y(x, y) is the output value of the single-channel image at position (x, y), and R(x, y), G(x, y), and B(x, y) respectively correspond to the input values of the red, green, and blue channels of the color image at (x, y).

Then, the grayscale image is converted into a binary one using the commonly adopted thresholding approach:

B(x, y) = f(Y(x, y), T(x, y))    (2)

where B(x, y) and T(x, y) are respectively the binary result and the threshold at position (x, y), and f is a function that assigns the value 1 to B(x, y) when Y(x, y) < T(x, y) (a dark pixel), and 0 otherwise.

Choosing a proper threshold is key to the binarization quality, which in turn influences the performance of QR code detection; ZXing uses a hybrid local block averaging method for this. The obtained image is first divided into non-overlapping 8-by-8 pixel blocks, and the average of each block is calculated. Then, for each block, the averages of the 5-by-5 set of blocks centered on it are averaged to obtain the final threshold for that block.

To facilitate the discussion, in the following, the k-th block is denoted as b_k, and the set of its neighboring blocks (including b_k itself) is denoted as N(b_k). When pixel (x, y) is located inside block b_k, this is written as (x, y) ∈ b_k. Within the same block, all pixels are compared against the same threshold T_k. So, for any pixel (x, y) ∈ b_k, we have

T(x, y) = T_k = (1 / |N(b_k)|) Σ_{b_j ∈ N(b_k)} mean(b_j)    (3)

where mean(b_j) is the average grayscale value of block b_j.

For general QR codes, this thresholding method helps reduce scanning errors, especially under different illumination conditions. For aesthetic QR codes, however, the image content strongly affects the threshold calculation, and an unexpected binary result is easily obtained because the computed threshold is too high or too low. Reducing this effect is therefore the key to maintaining scanning robustness.
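To make the preceding description concrete, the following Python sketch is an illustrative reimplementation of this hybrid block-average thresholding, not ZXing's actual code; the 8-by-8 block size, the 5-by-5 block neighborhood, and the dark-is-1 convention follow the description above.

import numpy as np

def hybrid_block_binarize(gray, block=8, neigh=5):
    """Binarize a grayscale image with block-averaged local thresholds.

    gray: 2-D array of pixel luminances. Returns a 0/1 array where 1 marks dark pixels.
    """
    h, w = gray.shape
    nby, nbx = (h + block - 1) // block, (w + block - 1) // block
    # Average luminance of every non-overlapping block.
    means = np.empty((nby, nbx), dtype=np.float64)
    for by in range(nby):
        for bx in range(nbx):
            means[by, bx] = gray[by*block:(by+1)*block, bx*block:(bx+1)*block].mean()
    # Threshold of a block = mean of the block averages in its 5x5 block neighborhood.
    r = neigh // 2
    thresh = np.empty_like(means)
    for by in range(nby):
        for bx in range(nbx):
            y0, y1 = max(0, by - r), min(nby, by + r + 1)
            x0, x1 = max(0, bx - r), min(nbx, bx + r + 1)
            thresh[by, bx] = means[y0:y1, x0:x1].mean()
    # Expand the block thresholds back to pixel resolution and compare.
    per_pixel = np.kron(thresh, np.ones((block, block)))[:h, :w]
    return (gray < per_pixel).astype(np.uint8)   # dark pixel -> 1

For a plain black-and-white QR code this reproduces the expected module pattern; for an aesthetic QR code, the image content shifts the block averages and hence the per-pixel thresholds, which is exactly the effect analyzed in the rest of the paper.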

Fig. 2: QR code scanning process

3.1.2 Detection

In the detection phase, the main task is to locate the position of the QR code from the binary image generated after preprocessing. Pattern matching is used as the primary method for locating the finder pattern.

The finder patterns are located at three corners of the QR code image, and each one comprises three concentric squares: a black 7-by-7 module square, a white 5-by-5 module square, and a black 3-by-3 module square. During detection, a black-white-black-white-black pattern with width ratios of 1:1:3:1:1, corresponding to the cross-section of a finder pattern, is matched in order to quickly identify the existence of a QR code. Finally, according to the relative positions of the three finder patterns, the exact position and orientation of the code are confirmed.
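As an illustration of this 1:1:3:1:1 test, the following simplified sketch (not ZXing's implementation; the tolerance value is an assumption) checks whether five consecutive runs of dark/light pixels along a scanline match the finder-pattern ratios.

def matches_finder_ratio(run_lengths, tolerance=0.5):
    """Check five consecutive runs (dark, light, dark, light, dark) against 1:1:3:1:1."""
    if len(run_lengths) != 5 or min(run_lengths) == 0:
        return False
    unit = sum(run_lengths) / 7.0          # the total width spans 7 modules
    expected = [1, 1, 3, 1, 1]
    return all(abs(run - e * unit) <= tolerance * unit
               for run, e in zip(run_lengths, expected))

# Example: a clean cross-section of a finder pattern at 4 pixels per module.
print(matches_finder_ratio([4, 4, 12, 4, 4]))   # True
print(matches_finder_ratio([4, 4, 4, 4, 4]))    # False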

3.1.3 Recognition

The process of recognizing QR codes mainly includes sampling and decoding. Sampling first estimates the size of each module and then determines the number of modules; the center pixel of each module is then sampled to represent the whole module, and a matrix is formed once all modules have been sampled.

The decoding process then examines the matrix obtained during sampling and parses the contained information. This involves inverting the QR encoding rules, including data masking, codeword rearrangement, error correction, and decoding. The Reed-Solomon algorithm is used during encoding and is thus necessary for error correction during decoding. When the errors in a block are too numerous, the original data cannot be recovered, resulting in a failure to resolve the QR code.

Fig. 3: Error analysis of the intermediate results of scanning different QR codes using the ZXing library. The last column shows the error images, where errors are depicted in red and correct measurements in green. a. QR code image generated using QART; b. QART QR code image captured using a mobile device under ordinary illumination; c. QR code image generated using CA; d. CA QR code image captured using a mobile device under ordinary illumination.
Fig. 4: Overview of scanning-robust aesthetic QR code generation process. Lines of different colors represent different stages. 1⃝ the binary aesthetic stage; 2⃝ the grayscale aesthetic stage; 3⃝ the color aesthetic stage

3.2 Scanning Error Analysis

When the QR code is located correctly and the resulting data matrix is decoded properly, we regard the code as accepted. However, during actual scanning, various kinds of noise and errors may occur. To further analyze the errors produced during the scanning process, we experimented with two typical aesthetic QR code implementations, QART [2] and CA [5]. As shown in Fig. 3, we used the ZXing library to record and compare the intermediate results for further identification of errors.

In Fig. 3, column 1 shows the input images. Apart from displaying the scanning process of the original QR code images (rows (a) and (c)), we used mobile devices to capture a group of images (rows (b) and (d)) under ordinary illumination, i.e., simulating everyday scanning, for comparison. Column 2 shows the binarization results obtained with the ZXing library. Column 3 displays the downsampling results of the QR codes after detection, that is, the matrix of sampling results. Finally, we compare the real sampling result matrix with the expected one, generating column 4, with red for errors and green for correctly identified values, which directly shows where errors were generated.

As evident from the results, images a-(4) and b-(4) are entirely green, which indirectly shows the good scanning robustness of QR codes generated by QART. Regardless of whether they were scanned directly or using mobile devices, the results were in line with the expected ones. However, images c-(4) and d-(4) contain errors, and the errors of d-(4) are much more severe than those of c-(4).

According to scanning principles, if there are errors in the matrix of sampling results, they need to be corrected by performing data correction such as using the Reed-Solomon code. When the percentage of the generated errors exceeds a certain threshold, error correction may fail, which results in a scanning failure.

This paper proposes a method that estimates the probability of errors occurring within a module during the generation process and takes action to prevent them, thereby obtaining a correct final sampling result matrix and improving scanning robustness; the details are described in Section 4.2.1.

4 ART-UP Design and Implementation

To present the generation of ART-UP codes more clearly, we divide the generation process into three stages: the binary aesthetic stage, the grayscale aesthetic stage, and the color aesthetic stage. In the binary aesthetic stage, codeword adjustment is achieved using visual saliency and Gauss-Jordan elimination, and the order of modules is adjusted to match the binarization result of the input image, which avoids visual conflicts globally. In the grayscale aesthetic stage, thresholding error and sampling error models are established by analyzing the scanning process of general scanners to simulate the process of thresholding and downsampling. In this manner, the probability of correctly sampling each module can be estimated and used as feedback for improving the generation process and ensuring the scanning robustness. In the color aesthetic stage, the pixels of each channel are calculated accurately to create the colored aesthetic QR codes by establishing a linear-based solution, which adjusts the luminance of original images to match the grayscale QR codes.

Fig. 4 shows the whole process of generating ART-UP codes, which proceeds from the binary aesthetic QR code, through the grayscale aesthetic QR code, to the final color aesthetic QR code. In the following three sections, the methods adopted for each stage are discussed in detail.

4.1 Binary Aesthetic QR Codes

The generation of the binary aesthetic QR code is the main preliminary step for the generation of the grayscale code, and its result directly determines the structural visual quality [24] of the final color code. In this step, we first generate a grayscale image from the original input image and then generate a binary image using a module-based thresholding algorithm. After that, the codewords of the original QR code are adjusted according to the priority weights in order to generate the target binary aesthetic QR code.

4.1.1 Module-Based Binarization

To make the generated binary aesthetic QR code similar to the original image, we need to generate a binary image by processing the input image. The side length of the grayscale image (in pixels) is usually much larger than the number of modules per side (4V + 17, where V is the version of the QR code), so simply scaling the image down to obtain a binary image gives a poor visual result. Here, the approach used in [5] is adopted to avoid this. First, the image is divided into square regions whose side length equals that of a QR code module. Marking the i-th such region as m_i, the binarization of the corresponding module is given by

M_i = f( Σ_{(x, y) ∈ m_i} w_i(x, y) · Y(x, y), T_i )    (4)

where Y(x, y) stands for the pixel of the grayscale image at (x, y), w_i(x, y) is the weight of pixel (x, y) with Σ_{(x, y) ∈ m_i} w_i(x, y) = 1, f is the binarization function of Equation 2, and T_i is the threshold of the i-th module. In this paper, w_i is a Gaussian distribution:

w_i(x, y) ∝ exp( −((x − x_i)^2 + (y − y_i)^2) / (2σ^2) )    (5)

where (x, y) belongs to m_i, w_i(x, y) refers to the weight of the i-th module at position (x, y), and (x_i, y_i) is the center of the module. It is necessary to note that in Equation 5 the position within the i-th module corresponds to the global position in the image. In addition, in our experiments, σ is set to a fixed constant.
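The sketch below illustrates this module-based binarization as we read it: each module's pixels are averaged with a Gaussian weight centered on the module, and the result is compared against a threshold. The global-mean threshold and the default σ are assumptions for illustration and do not reproduce the paper's exact parameter choices.

import numpy as np

def gaussian_weights(size, sigma):
    """Normalized 2-D Gaussian weights over a size-by-size module."""
    c = (size - 1) / 2.0
    y, x = np.mgrid[0:size, 0:size]
    w = np.exp(-((x - c) ** 2 + (y - c) ** 2) / (2.0 * sigma ** 2))
    return w / w.sum()

def module_binarize(gray, module_px, sigma=None):
    """Binarize an image module-by-module using Gaussian-weighted module averages."""
    sigma = sigma if sigma is not None else module_px / 3.0   # assumed default
    w = gaussian_weights(module_px, sigma)
    n = gray.shape[0] // module_px                            # modules per side
    threshold = gray.mean()                                   # assumed global threshold
    out = np.zeros((n, n), dtype=np.uint8)
    for i in range(n):
        for j in range(n):
            patch = gray[i*module_px:(i+1)*module_px, j*module_px:(j+1)*module_px]
            out[i, j] = 1 if (patch * w).sum() < threshold else 0   # dark module -> 1
    return out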

4.1.2 Codewords Adjustment

From the analysis of QR code scanning principles, it is evident that a scanned QR code will be accepted by scanners as long as the module pattern passes the error correction stage. QR codes employ Reed-Solomon encoding, which has the following properties: 1) it is a systematic code, in which the input data is embedded verbatim in the final encoded output, i.e., the first part is the original input data and the latter part is the error correction code; 2) the result of an exclusive-or between two different Reed-Solomon encoded blocks is also a valid Reed-Solomon encoded block.

During encoding, each block is an independent Reed-Solomon code, and a block can be separated into three areas: the input data bits, the padding bits, and the error correction bits, with lengths L_d, L_p, and L_e, respectively. According to the QR encoding rules, once the version and error correction level of a QR code are fixed, the total block length and L_e are fixed constants, while L_d and L_p change according to the size of the input data.

To describe the process of codeword adjustment clearly, consider Fig. 5(a) as an example. The first row contains the original information. In order to make the binary aesthetic QR code similar to the target binary image, suppose the j-th digit, located in the padding area of a block, needs to be adjusted from 1 to 0. A special operator is then constructed whose j-th digit is 1 while all other digits in the input data and padding areas are 0; its correction bits are generated by the checking rules of Reed-Solomon coding. According to Property 2, a legal Reed-Solomon code block is obtained through the exclusive-or of this special operator and the original code. The resulting code keeps the other digits of the input data and padding areas unchanged, while the j-th digit is inverted.

As described above, a group of operators can be constructed, as shown in Fig. 5(a). There is one operator per controllable digit, marked A_1, A_2, ..., where the corresponding digit of each operator is 1 while the rest of the digits in the input data and padding areas are 0. That is, each operator has exactly one 1 in the padding data area, which is marked in red. With this group of operators, we can apply an exclusive-or of the current data with A_j to set the j-th digit to the desired value without affecting any of the other controllable modules. The intermediate result adjusted by set A is shown in Fig. 5(c).

In order to keep the input data bits unchanged while updating the error correction bits, the controllable modules are initially limited to the padding data area. However, the Gauss-Jordan elimination method can be adopted to relax this constraint and allow further flexibility. By applying exclusive-or operations between the operators in set A, we can create a new operator set B whose basis operators trade data bits for error correction bits. In this way, the uncontrollable modules are dispersed, and the noise pixels are distributed over the canvas, as shown in Fig. 5(d).

It should be noted that the input data bits (including the ending indicator) have not been changed during this process, so according to Property 1, the information contained in the QR code is not affected by the codewords’ adjustment, which is necessary to ensure a correct decoding outcome.
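The sketch below illustrates the Gauss-Jordan step on a set of XOR operators. It assumes each operator is already a valid Reed-Solomon block represented as a Python integer bit mask (the RS parity generation itself is omitted); the elimination re-bases the operators so that each chosen target bit position can be flipped independently of the others.

def gauss_jordan_gf2(operators, target_bits):
    """Return one operator per target bit, with zeros at all other target bits.

    operators: list of int bit masks (each assumed to be a valid RS-encoded block).
    target_bits: bit positions we want to control independently.
    """
    rows = list(operators)
    basis = {}
    for bit in target_bits:
        # Find a remaining operator that has a 1 at this target position.
        pivot = next((r for r in rows if (r >> bit) & 1), None)
        if pivot is None:
            continue                       # this position cannot be controlled
        rows.remove(pivot)
        # Eliminate this bit from every other row and from earlier basis operators.
        rows = [r ^ pivot if (r >> bit) & 1 else r for r in rows]
        for b in list(basis):
            if (basis[b] >> bit) & 1:
                basis[b] ^= pivot
        basis[bit] = pivot
    return basis

# XORing basis[bit] into the current codeword flips exactly that target bit; any
# remaining changes fall only on positions outside the chosen target set, and the
# result stays a valid Reed-Solomon block because the code is linear.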

4.1.3 Visual Saliency Optimization

As noted above, due to the limitations of the QR encoding rules, the controllable modules can only be chosen within a certain range during codeword adjustment, and no new controllable modules can be added. In each block, the color of the controllable modules can be freely set, while the other areas are determined by the input data and the checking algorithm. Therefore, a priority algorithm is proposed for choosing the controllable modules.

In this paper, a linear combination of saliency, edge detection, and heuristic constraints is used to determine this priority for each module of the QR code:

W_i = λ_s S_i + λ_e E_i + λ_h H_i    (6)

where S_i and E_i are the visual saliency and edge detection results, respectively, for the image region of module m_i. To make the saliency and edge maps match the module grid in size, a mean-pooling operation is carried out. H_i encodes heuristic rules, so that when two modules have similar S and E values, the one closer to the image center obtains a larger weight:

H_i = 1 − d_i / d_max    (7)

where d_i is the distance from the center of module m_i to the image center, d_max is the maximum such distance, and the number of modules on each side of the code is 4V + 17. In this paper, the Canny [25] and Region-based Contrast (RC) [26] methods are adopted for edge detection and saliency extraction, respectively. After normalizing S, E, and H, the values of λ_s, λ_e, and λ_h are set to 0.67, 0.23, and 0.10, respectively, in our experiments. Although the heuristic term has only a small weight, for most images provided by users, such as logos and human faces, it still plays a key role in the choice of controllable modules.
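A sketch of this priority computation follows, assuming the saliency and edge maps have already been computed (e.g., with RC and Canny) and are passed in as arrays; the mean-pooling to the module grid, the centre-distance heuristic, and the weight assignment (0.67 for saliency, 0.23 for edges, 0.10 for the heuristic) follow our reading of the text above.

import numpy as np

def module_priority(saliency, edges, n_modules, w_s=0.67, w_e=0.23, w_h=0.10):
    """Per-module priority from saliency, edges, and a centre-distance heuristic."""
    def pool(img):
        # Mean-pool a pixel map down to the n_modules x n_modules grid.
        px = img.shape[0] // n_modules
        img = img[:n_modules * px, :n_modules * px].astype(np.float64)
        return img.reshape(n_modules, px, n_modules, px).mean(axis=(1, 3))

    def normalize(m):
        rng = m.max() - m.min()
        return (m - m.min()) / rng if rng > 0 else np.zeros_like(m)

    s, e = normalize(pool(saliency)), normalize(pool(edges))
    # Heuristic: modules closer to the image centre get larger weights.
    idx = np.arange(n_modules)
    yy, xx = np.meshgrid(idx, idx, indexing="ij")
    c = (n_modules - 1) / 2.0
    dist = np.sqrt((yy - c) ** 2 + (xx - c) ** 2)
    h = normalize(dist.max() - dist)
    return w_s * s + w_e * e + w_h * h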

(a)
(b)
(c)
(d)
Fig. 5: Codeword adjustment using the Gauss-Jordan elimination method. a. Generation of the operator set; b. Original QR code; c. Binary result of codeword adjustment by operator set A; d. Binary result of codeword adjustment by operator set B;

4.2 Grayscale Aesthetic QR Codes

To further enrich the details of the QR code, the visual effect is optimized by combining it with the input image to generate a grayscale aesthetic QR code. As shown in Fig. 4, the initial local threshold distribution is first pre-estimated from the grayscale image and the binary aesthetic QR code. Then, a module-based luminance modification algorithm adjusts the pixels of each module in the grayscale image so that scanning robustness is ensured in accordance with the expected probability distribution, and the result is combined with the binary aesthetic QR code to form a grayscale QR code. After that, the local threshold distribution of the actual result is recalculated and updated, and luminance modification is applied again. With continued iterations the local threshold distribution gradually becomes stable, and the final grayscale aesthetic QR code is obtained.

During generation of grayscale aesthetic QR codes, a module-based scanning probability estimation model is proposed in this paper, which is used to estimate the scanning robustness of the QR code. After that, an iterative image color updating algorithm is proposed to adjust the grayscale aesthetic QR code, making it robust and visually appealing. These will be introduced in the following sections.

4.2.1 Module-based Scanning Probability Estimation

According to the scanning process analyzed in Section 3.1 and the scanning error analysis of Section 3.2, we classify the errors that cause scanning failure into two types: thresholding errors and sampling errors. Thresholding errors mainly result from the difference between the real and expected thresholds due to the environment, the camera device, and the image content during local thresholding. Sampling errors occur during sampling because of location and scaling errors, where the real sampling position is far from the module center. In the following, we analyze these in detail and propose the module-based scanning probability estimation model.

Thresholding Error Estimation

As can be inferred from Section 3.1.1, generic QR code scanners use a thresholding method based on hybrid local block averages. In theory, the threshold applied to a certain pixel depends only on its image block and the neighboring blocks. In reality, however, thresholding is complicated, and the following factors must be taken into account: 1) in most realistic scenarios, the colors of the captured picture may differ from the original ones because of environmental conditions; 2) in the hybrid local block averaging method, the captured image is first divided into non-overlapping blocks, and the threshold is the same within a block, but the assignment of pixels to blocks cannot be predicted because the position and angle of the QR code in the captured image are unknown; 3) in theory, the size of the image blocks is defined exactly, but in reality, owing to scanning distance and image scaling, the relative size of an image block and a QR code module is not fixed.

Although an accurate determination of the actual thresholds of pixels may be difficult, good estimations are possible under certain assumptions. First, as for color deviation, the captured color is usually similar to the real one. Without considering the limitations of color thresholding, we can suppose that the probabilities of the captured color being more or less intense than the real one are equal, and that these probabilities decrease as the deviation grows. Second, because of the randomness of the position and angle of the code within the captured image, and hence the random assignment of pixels to blocks, we may as well assume that each pixel is located at the center of its block. In this way, in accordance with Equation 3, the thresholding method based on hybrid local block averages can be replaced by a thresholding method based on the local average, that is, the average of the neighboring pixels can be used to estimate the threshold. Even if the pixel is not located at the block center, its real threshold can still be estimated from the surrounding area, assuming that the local image area exhibits color continuity. Finally, the influence of scanning distance and image scaling on the real threshold is random and depends on how users scan, which determines what fraction of the captured image the QR code occupies; this fraction is usually clustered within a certain range, which means the threshold also varies within a corresponding range.

According to the above analysis, the same pixel of a QR code may be compared against different thresholds under different environments, resulting in different binarization results. General methods for generating aesthetic QR codes are usually applied under the assumption that the thresholds are constant, which leads to a reduction in scanning robustness. In this paper, in order to simulate the real thresholding process, we assume that the expected threshold of a pixel in an aesthetic QR code is the average of its surrounding pixels, denoted as T̄(x, y). The actual threshold t(x, y), which is affected by the environment, lies close to this value and follows a Gaussian distribution whose maximum is at T̄(x, y):

t(x, y) ∼ N(T̄(x, y), σ_t^2)    (8)

Meanwhile, according to the above,

T̄(x, y) = (1 / |Ω(x, y)|) Σ_{(u, v) ∈ Ω(x, y)} Y(u, v)    (9)

where Ω(x, y) represents the set of pixels neighboring the center (x, y). In this paper, Ω(x, y) is a square area whose side length is a fixed multiple of the side length of a QR module.

Given the distribution of thresholds at position (x, y), the original pixel value Y(x, y) is compared with the real threshold of this pixel according to Equation 2. Meanwhile, because of the limitations of color thresholding, the real threshold is actually restricted to a range [t_min, t_max]. Therefore, the probabilities that the thresholding result equals 0 or 1 are, respectively,

P_0(x, y) = ∫_{t_min}^{Y(x, y)} N(t; T̄(x, y), σ_t^2) dt    (10)

P_1(x, y) = ∫_{Y(x, y)}^{t_max} N(t; T̄(x, y), σ_t^2) dt    (11)

Because of the truncation of the threshold range, the sum of the two probabilities above is not equal to 1, so it is normalized. When pixel (x, y) is located in module m_i, that is, (x, y) ∈ m_i, the probability of correctly thresholding the pixel is

P_t(x, y) = P_{M_i}(x, y) / (P_0(x, y) + P_1(x, y))    (12)

where M_i ∈ {0, 1} is the target binary value of module m_i.

In Fig. 6, the curves of P_t versus the pixel grayscale value Y(x, y) are shown for different values of the expected threshold T̄. From these, it can be seen that when the expected threshold of the pixel is fixed, the probability of correctly thresholding the pixel is a monotonic function of the pixel luminance; that is, we can adjust the grayscale value of a pixel to raise this probability and thus increase robustness.

Fig. 6: Variation curves of P_t with the pixel grayscale value, corresponding to different values of the expected threshold T̄, when the color of the corresponding module of the binary aesthetic QR code is black.
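Under this model, the probability that a pixel is thresholded to its module's target bit can be evaluated with the Gaussian CDF over a truncated threshold range. The sketch below follows Equations 10-12 as reconstructed here; the σ of the threshold distribution and the admissible range [t_min, t_max] are illustrative assumptions.

import math

def gauss_cdf(x, mu, sigma):
    return 0.5 * (1.0 + math.erf((x - mu) / (sigma * math.sqrt(2.0))))

def p_threshold_correct(gray, t_bar, target_bit, sigma=20.0, t_min=0.0, t_max=255.0):
    """Probability that pixel value `gray` is binarized to `target_bit` (1 = dark).

    The actual threshold t ~ N(t_bar, sigma^2), truncated to [t_min, t_max];
    the pixel becomes 1 (dark) when gray < t and 0 otherwise.
    """
    p1 = gauss_cdf(t_max, t_bar, sigma) - gauss_cdf(gray, t_bar, sigma)   # P(t > gray)
    p0 = gauss_cdf(gray, t_bar, sigma) - gauss_cdf(t_min, t_bar, sigma)   # P(t <= gray)
    total = p0 + p1
    if total <= 0.0:
        return 0.0
    return (p1 if target_bit == 1 else p0) / total

# Example: a pixel much darker than its expected threshold is almost surely read as 1.
print(round(p_threshold_correct(gray=40, t_bar=128, target_bit=1), 3))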
Sampling Error Estimation

According to Section 3.1.3, under ideal conditions the downsampling result depends only on the center pixel of each module. However, in real environments, images captured by cameras are likely to differ from the original ones due to rotation, scaling, and even deformation, so the sampling result is also affected by the surrounding pixels.

Furthermore, this sampling error can hardly be simulated exactly, as that would require modeling many different application environments and large numbers of experiments. In this paper, a reasonable assumption is made that the probability of a pixel being sampled within a module follows a Gaussian distribution centered on the module. As shown in Fig. 7, the closer a point is to the module center, the more likely it is to be sampled:

P_s(x, y) ∝ exp( −((x − x_i)^2 + (y − y_i)^2) / (2σ_s^2) )    (13)

where (x_i, y_i) is the center of module m_i. In addition, over the whole module, the probabilities of the sampled pixels sum to 1, that is, Σ_{(x, y) ∈ m_i} P_s(x, y) = 1.

Fig. 7: The probability distribution of pixels being sampled within a module. We assume that the module is sampled one thousand times, and each sample corresponds to a blue point.
Probability of Correctly Scanning a Module

As can be inferred from the scanning process analysis, thresholding and downsampling are performed in different steps of the process, and the thresholding result of a pixel is independent of whether it is sampled. A pixel is selected as the sampling result according to the sampling distribution, so the probability that this pixel is sampled and thresholded correctly is

P_s(x, y) · P_t(x, y)    (14)

Therefore, the probability of correctly scanning module m_i is

P_{m_i} = Σ_{(x, y) ∈ m_i} P_s(x, y) · P_t(x, y)    (15)
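Combining the two error models, the probability of correctly scanning a module is the sampling-weighted sum of per-pixel correct-thresholding probabilities (Equations 13-15 as reconstructed here). The sketch below assumes the per-pixel probabilities have already been computed, e.g., with p_threshold_correct from the previous sketch, and that the σ of the sampling Gaussian is a free parameter.

import numpy as np

def sampling_weights(module_px, sigma):
    """Normalized Gaussian probability of each pixel in a module being sampled."""
    c = (module_px - 1) / 2.0
    y, x = np.mgrid[0:module_px, 0:module_px]
    p = np.exp(-((x - c) ** 2 + (y - c) ** 2) / (2.0 * sigma ** 2))
    return p / p.sum()

def module_scan_probability(p_thresh, sigma=None):
    """p_thresh: module_px x module_px array of per-pixel correct-thresholding probabilities."""
    size = p_thresh.shape[0]
    sigma = sigma if sigma is not None else size / 4.0        # assumed default
    p_s = sampling_weights(size, sigma)
    return float((p_s * p_thresh).sum())                      # Equation 15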

4.2.2 Luminance Adjustment

To balance the scanning robustness and the visual quality of the aesthetic QR code, this section proposes a luminance modification algorithm within each module to reconcile the binary aesthetic QR code and the grayscale image. In addition, the expected threshold of each pixel is estimated by means of an iterative algorithm. In this manner, the grayscale aesthetic QR code is finally obtained.

Input: Y(x, y), P_s(x, y), w(x, y), and T̄(x, y) for each (x, y) ∈ m_i, and the expected probability P_exp
Output: adjusted grayscale values Y(x, y) in module m_i
1:  Calculate P_t(x, y) for each (x, y) ∈ m_i, see Equations 8, 10, 11, 12
2:  Normalize w so that Σ_{(x, y) ∈ m_i} w(x, y) = 1
3:  while true do
4:      P_m ← Σ_{(x, y) ∈ m_i} P_s(x, y) · P_t(x, y)
5:      if P_m ≥ P_exp then
6:          break
7:      end if
8:      W ← Σ_{(x, y) ∈ m_i} w(x, y)
9:      if W = 0 then
10:         break
11:     end if
12:     for each (x, y) ∈ m_i do
13:         P_t(x, y) ← P_t(x, y) + (P_exp − P_m) · w(x, y) / (W · P_s(x, y))
14:         if P_t(x, y) ≥ 1 then
15:             P_t(x, y) ← 1
16:             w(x, y) ← 0
17:         end if
18:     end for
19: end while
20: for each (x, y) ∈ m_i do
21:     find the grayscale value g whose correct-thresholding probability equals P_t(x, y) (Fig. 6)
22:     Y(x, y) ← g
23: end for
Algorithm 1 Luminance Modification Algorithm
Luminance Modification Algorithm

The whole QR code is made up of many modules. By estimating the probability of correctly scanning a single module, we can indirectly estimate and control the error probability in the sampling result matrix. To guarantee the code's scanning robustness, this probability must be adjusted so that the errors that do occur can be recovered through error correction.

In this section, we assume that the expected threshold of each pixel during the thresholding process is a known quantity, so the corresponding P_t(x, y) can be calculated according to Equation 12. Then, in accordance with Section 4.2.1, the probability of correctly scanning each module of the grayscale aesthetic QR code can be calculated, and to maintain the scanning robustness of the whole QR code we impose the constraint

P_{m_i} ≥ P_exp^i    (16)

where P_exp^i is the minimum required probability of correctly scanning the i-th module. Here, a fixed initial value can be given; the choice of this value is discussed in Section 4.4.

As can be inferred from Equation 13, once the size and version of the QR code image are fixed, the sampling probability of each pixel in a module is a fixed value. So, according to Equations 14 and 15, when P_{m_i} < P_exp^i, that is, when the probability of correctly scanning the module is lower than expected, we have to adjust the pixels to meet the constraint by increasing the probability of correctly thresholding them. To achieve this, the following update of the thresholding probabilities within a module is used:

P_t'(x, y) = P_t(x, y) + (P_exp^i − P_{m_i}) · w(x, y) / P_s(x, y)    (17)

where w(x, y) is the weight for adjusting the thresholding probability at position (x, y), with Σ_{(x, y) ∈ m_i} w(x, y) = 1. It is not difficult to see that when Equation 17 is multiplied by P_s(x, y) and summed over all elements of the module, the normalization of w ensures that the updated module probability becomes larger than or equal to P_exp^i.

However, Equation 17 ignores the constraint on P_t, namely P_t(x, y) ≤ 1. During the actual update, this may cause the updated probability of correctly scanning the module to remain smaller than expected. Therefore, a simple iterative updating algorithm is proposed (Algorithm 1), in which w(x, y) is adjusted dynamically according to the saturated pixels, thus solving the above problem.

Through this algorithm, the correct-thresholding probability of each pixel can be assigned so that the probability of correctly scanning each module meets the expected value. As evident from Fig. 6, when the expected threshold is fixed, the grayscale value that a pixel adopts determines its probability of being thresholded correctly. So, by looking up this monotonic relation, the grayscale value corresponding to the adjusted probability can easily be obtained, and the luminance modification of the pixels is accomplished.
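The following sketch summarizes our reading of Algorithm 1: the deficit between the expected and current module probability is distributed over the pixels according to the weights, pixels whose thresholding probability saturates at 1 drop out of the weight pool, and the adjusted probabilities are finally mapped back to grayscale values. The bisection lookup over the monotone curve of Fig. 6 is illustrative only.

import numpy as np

def adjust_module(p_t, p_s, w, p_exp, max_iter=50):
    """Raise per-pixel thresholding probabilities until the module meets p_exp."""
    p_t, w = p_t.astype(float).copy(), w.astype(float).copy()
    for _ in range(max_iter):
        deficit = p_exp - float((p_s * p_t).sum())
        if deficit <= 0 or w.sum() <= 0:
            break
        # Distribute the deficit according to the (renormalized) weights (Equation 17).
        p_t = p_t + deficit * (w / w.sum()) / p_s
        saturated = p_t >= 1.0
        p_t[saturated] = 1.0
        w[saturated] = 0.0            # saturated pixels no longer share the deficit
    return p_t

def gray_for_probability(p_target, t_bar, target_bit, prob_fn):
    """Bisection on the monotone gray -> probability curve (prob_fn as in the earlier sketch)."""
    lo, hi = 0.0, 255.0
    increasing = prob_fn(hi, t_bar, target_bit) >= prob_fn(lo, t_bar, target_bit)
    for _ in range(40):
        mid = (lo + hi) / 2.0
        if (prob_fn(mid, t_bar, target_bit) < p_target) == increasing:
            lo = mid
        else:
            hi = mid
    return (lo + hi) / 2.0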

Input: grayscale image Y(x, y) for each pixel and P_exp^i for each module m_i
Output: grayscale aesthetic QR code
1:  Calculate P_s(x, y) for each pixel, see Equation 13
2:  Initialize the current result as the input grayscale image
3:  Initialize the iteration counter k ← 0
4:  while true do
5:      n ← 0
6:      for each pixel (x, y) do
7:          T̄(x, y) ← local average threshold computed from the current result, see Equation 9
8:          n ← n + 1 if pixel (x, y) was modified in the previous iteration
9:      end for
10:     if k > 0 then
11:         if n / N ≤ ε then
12:             break
13:         end if
14:     end if
15:     k ← k + 1
16:     record the modified-pixel ratio n / N for this iteration
17:     for each module m_i do
18:         prepare Y(x, y), P_s(x, y), w(x, y), and T̄(x, y) for (x, y) ∈ m_i, and P_exp^i
19:         update each pixel of m_i using Algorithm 1
20:     end for
21:     mark which pixels were modified in this iteration
22:     update the current result with the modified pixels
23: end while
Here N is the total number of pixels and ε is a small convergence tolerance.
Algorithm 2 Threshold Estimation Algorithm
Threshold Estimation Algorithm

In Algorithm 1, the expected threshold of each pixel was assumed to be known, and the luminance of the pixels was adjusted based on this threshold. However, according to Equation 9, the expected threshold T̄(x, y) depends on the pixel luminance, so adjusting the luminance of the pixels also changes the expected threshold.

To find a proper threshold such that the adjusted luminance of the pixels is consistent with the assumed expected threshold, an iterative threshold estimation algorithm (Algorithm 2) is developed, which estimates the expected threshold and gradually converges to the desired value.

According to our observations, the algorithm iterates effectively and converges quickly. For example, in our experiments, a 512-by-512 image needs only about 10 iterations to reach a steady state. The generation of a specific QR code is shown in Fig. 8: as the number of iterations grows, the percentage of pixels that need updating decreases quickly, and the appearance of the QR code stabilizes accordingly.
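A compressed sketch of the outer loop of Algorithm 2, as we read it: the expected thresholds are recomputed from the current image, every module is re-adjusted, and the loop stops once the fraction of modified pixels falls below a small tolerance. The neighborhood radius and tolerance are assumptions, and adjust_image is a hypothetical helper standing in for the per-module application of Algorithm 1.

import numpy as np

def local_mean(img, radius):
    """Mean of the square neighborhood around every pixel (brute force, for clarity)."""
    h, w = img.shape
    out = np.empty_like(img, dtype=np.float64)
    for y in range(h):
        for x in range(w):
            y0, y1 = max(0, y - radius), min(h, y + radius + 1)
            x0, x1 = max(0, x - radius), min(w, x + radius + 1)
            out[y, x] = img[y0:y1, x0:x1].mean()
    return out

def estimate_thresholds(gray, adjust_image, radius=12, tol=1e-3, max_iter=20):
    """Iterate luminance modification until the image (and hence the thresholds) stabilizes."""
    current = gray.astype(np.float64).copy()
    for _ in range(max_iter):
        t_bar = local_mean(current, radius)            # expected thresholds (Equation 9)
        updated = adjust_image(current, t_bar)         # per-module Algorithm 1 (hypothetical)
        changed = np.mean(np.abs(updated - current) > 1.0)
        current = updated
        if changed <= tol:                             # few pixels are still being modified
            break
    return current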

Fig. 8: The percentage of the modified area decreases quickly as the number of iterations increases.

4.3 Color Aesthetic QR Codes

After the grayscale QR code has been generated, it needs to be transformed into a color aesthetic QR code under the constraint of not changing the luminance. As can be inferred from Equation 1, the mapping between RGB and grayscale is many-to-one: many RGB colors map to the same grayscale value, so there are many possible color images corresponding to a given grayscale image.

During color aesthetic QR code generation, not only must the luminance be maintained, but the generated color image also needs to resemble the original image to guarantee the visual effect. The general method is to first convert from RGB space to another color space (such as HSL or LAB), then keep the other channels unchanged while adjusting only the luminance channel until the corresponding grayscale image matches the grayscale aesthetic QR code, and finally convert the image back to RGB space. However, when converting between color spaces, the range of expressible colors is limited, so simply adjusting the luminance may not meet the requirements in some cases. In addition, luminance adjustment is usually cast as an optimization problem, which increases the computational cost [9].

Inspired by the bilinear interpolation algorithm, a linear luminance adjustment algorithm is proposed in this paper, which alleviates the above problems. In this method, luminance adjustment is treated as a linear interpolation process. For convenience, we first define V(x, y), which represents the minimum or maximum luminance value at position (x, y). That is,

V(x, y) = 255 if Q_g(x, y) ≥ Y(x, y), and V(x, y) = 0 otherwise    (18)

where Q_g denotes the grayscale aesthetic QR code and Y the grayscale version of the original image. After that, according to the principle above, let

C_out(x, y) = (1 − α(x, y)) · C_in(x, y) + α(x, y) · V(x, y),   C ∈ {R, G, B}    (19)

where α(x, y) is a variable, ranging from 0 to 1, that is used for luminance adjustment, and C_out(x, y) is the output value of channel C of the color image at (x, y). As can be inferred from Equation 1, if the grayscale image obtained when scanning the output is to match Q_g, then the weighted channel combination of Equation 1 applied to the output must satisfy

w_R R_out(x, y) + w_G G_out(x, y) + w_B B_out(x, y) = Q_g(x, y)    (20)

where w_R, w_G, and w_B are the channel weights of Equation 1 (which sum to 1). Therefore, by multiplying both sides of Equation 19 by the channel weights and summing over the three channels, we have

Q_g(x, y) = (1 − α(x, y)) · Y(x, y) + α(x, y) · V(x, y)    (21)

which, when solved for α, gives us

α(x, y) = (Q_g(x, y) − Y(x, y)) / (V(x, y) − Y(x, y))    (22)

Combining Equation 22 and Equation 19, we finally have

C_out(x, y) = C_in(x, y) + (Q_g(x, y) − Y(x, y)) / (V(x, y) − Y(x, y)) · (V(x, y) − C_in(x, y))    (23)

As can be seen from the above, all the variables on the right-hand side are known, so it is easy to combine the grayscale aesthetic QR code with the original color image to generate the color aesthetic QR code. Two examples of grayscale-to-color conversion are shown in Fig. 9.
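A sketch of this linear luminance adjustment follows; the grayscale weights are assumed to be the standard 0.299/0.587/0.114 combination of Equation 1, and each pixel of the original color image is interpolated toward pure black or pure white until its luminance matches the grayscale aesthetic QR code.

import numpy as np

GRAY_W = np.array([0.299, 0.587, 0.114])   # assumed channel weights of Equation 1

def colorize(q_gray, original_rgb):
    """Blend each pixel of original_rgb toward black or white so its luminance equals q_gray."""
    rgb = original_rgb.astype(np.float64)
    y = rgb @ GRAY_W                                   # current luminance of the color image
    target = q_gray.astype(np.float64)
    v = np.where(target >= y, 255.0, 0.0)              # interpolation endpoint (Equation 18)
    denom = v - y
    alpha = np.zeros_like(y)
    np.divide(target - y, denom, out=alpha, where=np.abs(denom) > 1e-6)   # Equation 22
    out = rgb + alpha[..., None] * (v[..., None] - rgb)                    # Equation 23
    return np.clip(out, 0, 255).astype(np.uint8)

Because the blend endpoint is pure black or pure white, the result always stays inside the RGB gamut, which is the reason this per-pixel closed form avoids the optimization step mentioned above.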

(a) Case I
(b) Case II
Fig. 9: Example of grayscale-to-colored QR code conversion

4.4 Parameter Analysis

Through the above analysis, a complete aesthetic QR code generation algorithm focused on scanning robustness has been presented. The expected-probability matrix P_exp is used to ensure the scanning robustness of the generated code, and the weight matrix w initializes the adjustment weight of each module. The question remains, however, of how to choose these values in actual applications and what influence they have on the generated aesthetic QR codes.

First, to study the effect of P_exp on the generated QR code, we conducted a set of experiments. We let every element of P_exp be equal to a single value p. Then, a dataset containing 300 images was created, and aesthetic QR codes were generated for each value of p in 1.00, 0.99, ..., 0.00, creating a total of 30300 QR code images. By scanning the resulting images, a curve describing the relationship between the scanning success rate and p was obtained. As shown in Fig. 10, as p decreases, the scanning success rate also declines and gradually approaches 0.

Fig. 10: Influence of P_exp on aesthetic QR code generation

In the generation algorithm, scanning robustness is mainly enforced by ensuring that P_{m_i}, i.e., the probability of correctly scanning a single module, is not less than P_exp^i. If each module is considered an independent trial of success or failure, the probability of successfully scanning the whole QR code equals the probability that the fraction of failed modules stays below the threshold required to recover the codewords. Therefore, improving the scanning probability of a single module increases the overall scanning robustness of the QR code.

As observed in Fig. 10a-f, the larger p is, the more similar the resulting image is to the binary aesthetic QR code. When p = 1, that is, P_exp^i = 1 for all modules, the generated color aesthetic QR code is exactly the same as the binary aesthetic QR code. Conversely, the smaller p is, the more similar the result is to the original image. When p = 0, except for some functional pattern areas, the remaining areas of the generated color aesthetic QR code resemble the original image exactly.

As can be inferred from the above discussion, for a single module, the larger P_exp^i is, the larger its probability of being scanned correctly. In real applications, setting P_exp^i = 1 for all modules is not required, so P_exp is usually adjusted locally to obtain a better aesthetic effect. For instance, one can use

(24)

to keep P_exp^i within a prescribed range.

Finally, the influence of the weight matrix w on the generated color aesthetic QR code is discussed. In Algorithm 1, it initializes the weights that distribute the probability adjustment within a module. In the examples above, it is by default a 2-dimensional Gaussian distribution, which gives priority to adjusting the color near the module's center. In practice, more styles of color aesthetic QR codes can be obtained by using different initializations of w. Fig. 11 shows results for the same image under 6 different initializations of w. The scanning robustness is maintained for all of these appearances, which indicates that the constraint of P_exp is effective.

The two sets of parameters (P_exp and w) provide flexibility in the generation of color aesthetic QR codes, enabling users to generate aesthetic QR codes of different robustness and visual styles according to their requirements.

Fig. 11: Aesthetic QR codes of different styles obtained with different initial values of w. a. Initialization with a 2-D Gaussian distribution; b. Initialization with a constant value for all pixels; c. Initialization with random values in the range [0, 1]; d. Initialization using pixel values from another image, for example, to generate a patch of QR pattern in the bottom-right of the image; e. Greater central than peripheral weights, producing an appearance similar to Visualead's; f. Initialization using a superposition of a Gaussian matrix and the image edge detection result.

5 Experiments

In this section, the visual effects and the robustness of QR codes are evaluated through extensive experiments. First of all, a user survey was conducted to compare the visual effects of different methods. After that, the robustness of QR codes was quantified and compared through decoding experiments and user evaluations.

5.1 Visual Effects

To evaluate the visual effect of color aesthetic QR codes, a user survey was conducted. Different types of images were first chosen, including cartoons, sceneries, buildings, animals, human faces, and brands. After that, 4 different methods of generating aesthetic QR codes were applied to generate the corresponding code for each image. Among them, Halftone [3] is a popular generation method for aesthetic QR codes, Visualead [19] is a widely used commercial algorithm, and Efficient [7] is a recently presented QR code beautification algorithm. We used these methods to obtain 20 groups of images in total, each group containing an original image and 4 QR code images generated by the different methods. In this way, an evaluation questionnaire for the visual effect was created. Sample images are shown in Table I.

TABLE I: Examples Generated using Different Algorithms
(a) Aesthetic Measure
(b) Similarity Measure
Fig. 12: User study about aesthetic and similarity

40 volunteers, 25 males and 15 females, were invited to participate in the survey. The images of each group were displayed, and the volunteers were asked to give subjective scores ranging from 1 to 5 according to the perceived visual appeal and the similarity to the original images.

As shown in Fig. 12, the average scores of visual appeal of the QR code images generated by the four algorithms, and of the similarity between the generated images and the original ones, were computed.

It can be seen that the scores of the two measures are similar: the visual quality achieved by the proposed ART-UP is comparable to that of the state-of-the-art Efficient method [7], and is much better than that of the Visualead [19] and Halftone [3] algorithms.

Fig. 13: The test method for scanning robustness of angle variation. The QR code was placed in the center of the coordinate system, with the camera a fixed distance from the origin and aimed toward the center.

5.2 Scanning Robustness

The scanning robustness of the QR codes was evaluated from various aspects, including scanning angle variation, brightness variation, scale variation, coverage, and scanning in real environments. 30 images were randomly selected from the 300-image dataset to form a sub-dataset, and the robustness tests were carried out on this subset.

Scanning Angle Variation. Each QR code image was placed at the origin of a 3-D coordinate system, and the camera was rotated about the x-, y-, and z-axes, respectively, as shown in Fig. 13. The rotations were sampled in fixed angular steps over the tested ranges, yielding 972 measurements in total. Meanwhile, the angle between the plane of the QR code and that of the camera was also computed and evaluated. The results show that ART-UP is slightly inferior to Visualead's commercial algorithm in terms of scanning-angle robustness, but better than the other two methods.

Brightness Variation. QR code scanning is easily affected by illumination and by the capturing device, resulting in differences between the captured and the real luminance. To study this effect, it was simulated with a linear brightness adjustment, varying the brightness offset of each image so that every QR code image yielded 511 images of different luminance levels. The results show that, for small brightness changes, ART-UP is clearly the most robust to illumination variation among all methods.

Scale Variation. For aesthetic QR codes, scale variation is always a key factor affecting scanning: whether printed or shown on a screen, QR codes face scaling issues. For each image in the dataset, a QR code image was generated and then resampled at a fixed step over a range of scaling ratios, yielding 61 images at different scales per QR code; the scaling was performed with bi-cubic interpolation. The results show that ART-UP performs best among the four schemes under scale variation, and its performance is consistent. Efficient is consistent but has a low scanning success rate, while Visualead shows serious oscillation and inconsistency.

Coverage. QR codes tolerate a considerable percentage of errors, but this tolerance is consumed when the code is partially covered or has a missing segment. Each QR code image was covered with a fixed number of square black-or-white blocks placed at random positions. After 30 rounds of repeated testing, the average performance was determined. The results show that ART-UP, Visualead, and Halftone perform similarly, while the Efficient algorithm suffers more in these cases.

Scanning in realistic conditions. Besides the simulated scanning experiments, a subjective experiment was carried out in which 30 volunteers were invited to rate the QR codes' performance under real scanning circumstances. No constraints were placed on the scanning tools, environments, or methods. Volunteers were shown the QR codes generated by the different algorithms and were asked to scan each one, then give a score from 1 to 5 based on the response time, where 1 indicates that the code could hardly be scanned successfully and 5 represents a prompt response. The results show that ART-UP scores almost the same as Visualead, and both rank better than Halftone and Efficient. This experiment demonstrates that the proposed algorithm is robust enough for most common scenes; even when the codes are printed on paper or other materials, the response remains quick and stable.

6 Conclusion

In this paper, a flexible algorithm for generating aesthetic QR codes is proposed, in which the code is generated based on a probability model of successful scanning that controls the tradeoff between visual effect and scanning robustness. By modeling the thresholding and sampling errors, the proposed algorithm can generate aesthetic QR codes with different appearances using a hierarchical, coarse-to-fine strategy. Experimental results show that ART-UP is scanning-robust and avoids scanning errors in most realistic scenarios. The algorithm not only provides aesthetic results but also ensures code usability, and is thus suitable for commercial needs.

References

  • [1] E. ISO/IEC, “Information technology - automatic identification and data capture techniques - bar code symbology - qr code,” 2000.
  • [2] R. Cox, “Qart codes,” http://research.swtch.com/qart, 2012, accessed October 16, 2016.
  • [3] H.-K. Chu, C.-S. Chang, R.-R. Lee, and N. J. Mitra, “Halftone qr codes,” ACM Transactions on Graphics (TOG), vol. 32, no. 6, p. 217, 2013.
  • [4] Y.-H. Lin, Y.-P. Chang, and J.-L. Wu, “Appearance-based qr code beautifier,” IEEE Transactions on Multimedia, vol. 15, no. 8, pp. 2198–2207, 2013.
  • [5] Y. Zhang, S. Deng, Z. Liu, and Y. Wang, “Aesthetic qr codes based on two-stage image blending,” in MultiMedia Modeling.   Springer, 2015, Conference Proceedings, pp. 183–194.
  • [6] S. Ramya and C. S. Joice, “An optimized image and data embedding in color qr code,” Structure, vol. 17, p. 4V, 2015.
  • [7] S.-S. Lin, M.-C. Hu, C.-H. Lee, and T.-Y. Lee, “Efficient qr code beautification with high quality visual content,” IEEE Transactions on Multimedia, vol. 17, no. 9, pp. 1515–1524, 2015.
  • [8] F. J. MacWilliams and N. J. A. Sloane, The theory of error correcting codes.   Elsevier, 1977.
  • [9] G. J. Garateguy, G. R. Arce, D. L. Lau, and O. P. Villarreal, “Qr images: optimized image embedding in qr codes,” IEEE transactions on image processing, vol. 23, no. 7, pp. 2842–2853, 2014.
  • [10] C. Fang, C. Zhang, and E.-C. Chang, “An optimization model for aesthetic two-dimensional barcodes,” in MultiMedia Modeling.   Springer, 2014, pp. 278–290.
  • [11] L. Li, J. Qiu, J. Lu, and C.-C. Chang, “An aesthetic qr code solution based on error correction mechanism,” Journal of Systems and Software, 2015.
  • [12] B. Jiang and X. Liu, “Qr code embellishment with background facial image embedding,” in SIGGRAPH Asia 2014 Posters.   ACM, 2014, p. 31.
  • [13] Z. Baharav and R. Kakarala, “Visually significant qr codes: Image blending and statistical analysis,” in Multimedia and Expo (ICME), 2013 IEEE International Conference on.   IEEE, 2013, pp. 1–6.
  • [14] Y.-S. Lin, S.-J. Luo, and B.-Y. Chen, “Artistic qr code embellishment,” in Computer Graphics Forum, vol. 32, no. 7.   Wiley Online Library, 2013, pp. 137–146.
  • [15] Z. Gao, G. Zhai, and C. Hu, “The invisible qr code,” in Proceedings of the 23rd Annual ACM Conference on Multimedia Conference.   ACM, 2015, pp. 1047–1050.
  • [16] Z. Yang, Y. Bao, C. Luo, X. Zhao, S. Zhu, C. Peng, Y. Liu, and X. Wang, “Artcode: preserve art and code in any image,” in Proceedings of the 2016 ACM International Joint Conference on Pervasive and Ubiquitous Computing.   ACM, 2016, pp. 904–915.
  • [17] D. Samretwit and T. Wakahara, “Measurement of reading characteristics of multiplexed image in qr code,” in Intelligent Networking and Collaborative Systems (INCoS), 2011 Third International Conference on.   IEEE, 2011, pp. 552–557.
  • [18] A. Laporte, B. Reulier, S. Ternoir, G. Chapuis, and R. Kassel, “Unitag,” https://www.unitag.io, 2011, accessed October 16, 2016.
  • [19] N. Alva, I. Friedman, and U. Peled, “Visualead,” http://www.visualead.com, 2012, accessed October 16, 2016.
  • [20] S. Owen, “Zxing,” Zebra Crossing, 2013.
  • [21] E. Ohbuchi, H. Hanaizumi, and L. A. Hock, “Barcode readers using the camera device in mobile phones,” in Cyberworlds, 2004 International Conference on.   IEEE, 2004, pp. 260–265.
  • [22] Y. Liu and M. Liu, “Automatic recognition algorithm of quick response code based on embedded system,” in Sixth International Conference on Intelligent Systems Design and Applications, vol. 2.   IEEE, 2006, pp. 783–788.
  • [23] Y. Liu, J. Yang, and M. Liu, “Recognition of qr code with mobile phones,” in 2008 Chinese Control and Decision Conference.   IEEE, 2008, pp. 203–206.
  • [24] Z. Wang, A. C. Bovik, H. R. Sheikh, and E. P. Simoncelli, “Image quality assessment: from error visibility to structural similarity,” IEEE transactions on image processing, vol. 13, no. 4, pp. 600–612, 2004.
  • [25] J. Canny, “A computational approach to edge detection,” IEEE Transactions on Pattern Analysis and Machine Intelligence(IEEE TPAMI), no. 6, pp. 679–698, 1986.
  • [26] M.-M. Cheng, N. J. Mitra, X. Huang, P. H. S. Torr, and S.-M. Hu, “Global contrast based salient region detection,” IEEE Transactions on Pattern Analysis and Machine Intelligence(IEEE TPAMI), vol. 37, no. 3, pp. 569–582, 2015.

Mingliang Xu is an associate professor in the School of Information Engineering of Zhengzhou University, China, and currently is the director of CIISR (Center for Interdisciplinary Information Science Research) and the general secretary of ACM SIGAI China. His research interests include virtual reality and artificial intelligence. Xu received his Ph.D. degree in computer science and technology from the State Key Lab of CAD&CG at Zhejiang University.

Qingfeng Li is a master student in the Center for Interdisciplinary Information Science Research, Zhengzhou University, China. He received the B.E. degree in Computer Software from Zhengzhou University in 2014. He is currently working with Prof. Jianwei Niu at Beihang University. His research interests include computer vision, image processing, and computer graphics.

Jianwei Niu received the M.S. and Ph.D. degrees in computer science from Beihang University, Beijing, China, in 1998 and 2002, respectively. He was a visiting scholar at the School of Computer Science, Carnegie Mellon University, USA, from Jan. 2010 to Feb. 2011. He is a professor in the School of Computer Science and Engineering, BUAA, and an IEEE senior member. His current research interests include mobile and pervasive computing and mobile video analysis.

Xiting Liu is currently a senior student at Beihang University, Beijing, China, majoring in computer science and technology. He studied and worked with Prof. Jianwei Niu in 2016. His research interests include computer vision and image processing.

Weiwei Xu is now a researcher in the State Key Lab of CAD&CG, College of Computer Science, Zhejiang University, and a recipient of the NSFC Excellent Young Scholars Program in 2013. His main research interests are digital geometry processing, physical simulation, and virtual reality.

Pei Lv is an assistant professor in the Center for Interdisciplinary Information Science Research, Zhengzhou University, China. His research interests include video analysis and crowd simulation. He received his Ph.D. in 2013 from the State Key Lab of CAD&CG, Zhejiang University, China.

Bing Zhou is currently a professor in the Center for Interdisciplinary Information Science Research, Zhengzhou University, Henan, China. He received the B.S. and M.S. degrees from Xi’an Jiaotong University in 1986 and 1989, respectively, and the Ph.D. degree from Beihang University in 2003, all in computer science. His research interests cover video processing and understanding, surveillance, computer vision, and multimedia applications.
