
Android USB Camera (1): Debugging Notes

1. Introduction

A while back I debugged a UVC camera; these are my notes from the process. The hardware platform is MT6735 and the software platform is Android 5.0.

2. Low-Level Configuration

UVC stands for USB Video Class, a USB specification for video devices. Cameras that follow the UVC protocol need no vendor-specific driver; one generic driver covers them all. The Linux kernel already ships a UVC driver, located at kernel-3.10/drivers/media/usb/uvc/.

2.1 Enabling Kernel Options

The Linux kernel needs the following options enabled to support UVC devices:

CONFIG_MEDIA_SUPPORT=y
CONFIG_MEDIA_CAMERA_SUPPORT=y
CONFIG_VIDEO_DEV=y
CONFIG_VIDEO_V4L2=y
CONFIG_VIDEOBUF2_CORE=y
CONFIG_VIDEOBUF2_MEMOPS=y
CONFIG_VIDEOBUF2_VMALLOC=y
CONFIG_MEDIA_USB_SUPPORT=y
CONFIG_USB_VIDEO_CLASS=y

The MTK platform additionally needs the OTG options:

CONFIG_USB_MTK_OTG=y 
CONFIG_USB_MTK_HDRC=y 
CONFIG_USB_MTK_HDRC_HCD=y

Plug in the camera; if the /dev/video0 device node appears, the UVC camera has been loaded successfully. Once the node exists, it still needs permissions.

2.2 Adding Permissions

Add the following to ueventd.rc:

/dev/video0               0666   root       root

Add the following to system_app.te:

allow system_app video_device:chr_file { read write open getattr };

2.3 Debugging

If /dev/video0 does not appear, first check whether USB enumeration succeeded by cat-ing the debug node in a shell:

cat /sys/kernel/debug/usb/devices

If the camera enumerated successfully, its device information shows up:

T:  Bus=01 Lev=00 Prnt=00 Port=00 Cnt=00 Dev#=  1 Spd=480 MxCh= 1
D:  Ver= 2.00 Cls=00(>ifc ) Sub=00 Prot=00 MxPS=64 #Cfgs=  1
P:  Vendor=18EC ProdID=3399 Rev= 0.00
S:  Manufacturer=ARKMICRO
S:  Product=USB PC CAMERA

If it enumerated, next check whether the camera actually follows the UVC protocol. Plug it into a PC (running Ubuntu) and use lsusb to look for video-class interface descriptors:

lsusb -d 18ec:3399 -v | grep "14 Video"

If the camera follows the UVC protocol, it prints something like:

bFunctionClass 14 Video
bInterfaceClass 14 Video
bInterfaceClass 14 Video
bInterfaceClass 14 Video

Here 18ec:3399 is the camera's VID and PID, and "14 Video" marks the USB video class.

2.4 Useful Debugging Commands

Enable/disable the Linux uvcvideo driver trace log:

echo 0xffff > /sys/module/uvcvideo/parameters/trace   // enable
echo 0 > /sys/module/uvcvideo/parameters/trace        // disable

Dump the full USB device descriptors:

lsusb -d 18ec:3399 -v

3. Application Layer

V4L2 (Video for Linux 2) is the Linux kernel's driver framework for video devices. It gives user space a uniform interface to the underlying video hardware, and it is the standard programming interface for driver-free UVC USB devices, used chiefly to capture from USB cameras.

MTK's stock Camera stack does not use the v4l2 framework, so the basic v4l2 capture flow has to be implemented in the JNI layer.

3.1 Operation Flow

In v4l2 programming, the device is generally driven through the ioctl function:

extern int ioctl (int __fd, unsigned long int __request, ...) __THROW;

__fd: the file descriptor of the device, e.g. the cameraFd returned after opening /dev/video0 with open();
__request: the specific command identifier.
The following command identifiers are commonly used in V4L2 development:
VIDIOC_REQBUFS: allocate frame buffers
VIDIOC_QUERYBUF: query the buffers allocated by VIDIOC_REQBUFS so they can be mapped
VIDIOC_QUERYCAP: query driver capabilities
VIDIOC_ENUM_FMT: enumerate the video formats the driver supports
VIDIOC_S_FMT: set the current video format
VIDIOC_G_FMT: get the current video format
VIDIOC_TRY_FMT: validate a video format without setting it
VIDIOC_CROPCAP: query the driver's cropping capabilities
VIDIOC_S_CROP: set the cropping rectangle
VIDIOC_G_CROP: get the cropping rectangle
VIDIOC_QBUF: queue a buffer back to the driver
VIDIOC_DQBUF: dequeue a filled buffer
VIDIOC_STREAMON: start video capture
VIDIOC_STREAMOFF: stop video capture
VIDIOC_QUERYSTD: query the video standards the device supports, e.g. PAL or NTSC.
Some of these ioctl calls are mandatory, others optional.

The open-source app simplewebcam already implements this basic v4l2 capture flow. Let's walk through how it does it.

(Figure: v4l2 operation flow)

3.2 Implementation

(1) Open the device node

int opendevice(int i)
{
    struct stat st;

    sprintf(dev_name,"/dev/video%d",i);

    if (-1 == stat (dev_name, &st)) {
        LOGE("Cannot identify '%s': %d, %s", dev_name, errno, strerror (errno));
        return ERROR_LOCAL;
    }

    if (!S_ISCHR (st.st_mode)) {
        LOGE("%s is no device", dev_name);
        return ERROR_LOCAL;
    }

    fd = open (dev_name, O_RDWR);

    if (-1 == fd) {
        LOGE("Cannot open '%s': %d, %s", dev_name, errno, strerror (errno));
        return ERROR_LOCAL;
    }
    return SUCCESS_LOCAL;
}

(2) Query driver capabilities

int initdevice(void) 
{
    struct v4l2_capability cap;
    struct v4l2_format fmt;
    unsigned int min;

    if (-1 == xioctl (fd, VIDIOC_QUERYCAP, &cap)) {
        if (EINVAL == errno) {
            LOGE("%s is no V4L2 device", dev_name);
            return ERROR_LOCAL;
        } else {
            return errnoexit ("VIDIOC_QUERYCAP");
        }
    }

    if (!(cap.capabilities & V4L2_CAP_VIDEO_CAPTURE)) {
        LOGE("%s is no video capture device", dev_name);
        return ERROR_LOCAL;
    }

    if (!(cap.capabilities & V4L2_CAP_STREAMING)) {
        LOGE("%s does not support streaming i/o", dev_name);
        return ERROR_LOCAL;
    }

    ......

}

(3) Set the video format

int initdevice(void) 
{
    struct v4l2_capability cap;
    struct v4l2_format fmt;

    ......

    CLEAR (fmt);
    fmt.type                = V4L2_BUF_TYPE_VIDEO_CAPTURE;
    fmt.fmt.pix.width       = IMG_WIDTH; 
    fmt.fmt.pix.height      = IMG_HEIGHT;
    fmt.fmt.pix.pixelformat = V4L2_PIX_FMT_MJPEG;

    if (-1 == xioctl (fd, VIDIOC_S_FMT, &fmt))
        return errnoexit ("VIDIOC_S_FMT");

    ......
}

(4) Request frame buffers and map them into user space

int initmmap(void)
{
    struct v4l2_requestbuffers req;

    CLEAR (req);
    req.count               = 4;
    req.type                = V4L2_BUF_TYPE_VIDEO_CAPTURE;
    req.memory              = V4L2_MEMORY_MMAP;

    if (-1 == xioctl (fd, VIDIOC_REQBUFS, &req)) {
        if (EINVAL == errno) {
            LOGE("%s does not support memory mapping", dev_name);
            return ERROR_LOCAL;
        } else {
            return errnoexit ("VIDIOC_REQBUFS");
        }
    }

    if (req.count < 2) {
        LOGE("Insufficient buffer memory on %s", dev_name);
        return ERROR_LOCAL;
    }

    buffers = calloc (req.count, sizeof (*buffers));

    if (!buffers) {
        LOGE("Out of memory");
        return ERROR_LOCAL;
    }

    for (n_buffers = 0; n_buffers < req.count; ++n_buffers) {
        struct v4l2_buffer buf;

        CLEAR (buf);
        buf.type        = V4L2_BUF_TYPE_VIDEO_CAPTURE;
        buf.memory      = V4L2_MEMORY_MMAP;
        buf.index       = n_buffers;

        if (-1 == xioctl (fd, VIDIOC_QUERYBUF, &buf))
            return errnoexit ("VIDIOC_QUERYBUF");

        buffers[n_buffers].length = buf.length;
        buffers[n_buffers].start =
        mmap (NULL ,
            buf.length,
            PROT_READ | PROT_WRITE,
            MAP_SHARED,
            fd, buf.m.offset);

        if (MAP_FAILED == buffers[n_buffers].start)
            return errnoexit ("mmap");
    }

    return SUCCESS_LOCAL;
}

(5) Queue the buffers and start capture

int startcapturing(void)
{
    unsigned int i;
    struct v4l2_buffer buf;
    enum v4l2_buf_type type;

    for (i = 0; i < n_buffers; ++i) {
        CLEAR (buf);
        buf.type        = V4L2_BUF_TYPE_VIDEO_CAPTURE;
        buf.memory      = V4L2_MEMORY_MMAP;
        buf.index       = i;

        if (-1 == xioctl (fd, VIDIOC_QBUF, &buf))
            return errnoexit ("VIDIOC_QBUF");
    }

    type = V4L2_BUF_TYPE_VIDEO_CAPTURE;
    if (-1 == xioctl (fd, VIDIOC_STREAMON, &type))
        return errnoexit ("VIDIOC_STREAMON");

    return SUCCESS_LOCAL;
}

(6) Dequeue a frame

int readframeonce(void)
{
    for (;;) {
        fd_set fds;
        struct timeval tv;
        int r;

        FD_ZERO (&fds);
        FD_SET (fd, &fds);

        tv.tv_sec = 2;
        tv.tv_usec = 0;

        r = select (fd + 1, &fds, NULL, NULL, &tv);

        if (-1 == r) {
            if (EINTR == errno)
                continue;

            return errnoexit ("select");
        }

        if (0 == r) {
            LOGE("select timeout");
            return ERROR_LOCAL;

        }

        if (readframe ()==1)
            break;

    }

    return realImageSize;

}
int readframe(void)
{
    struct v4l2_buffer buf;
    unsigned int i;

    CLEAR (buf);

    buf.type = V4L2_BUF_TYPE_VIDEO_CAPTURE;
    buf.memory = V4L2_MEMORY_MMAP;

    if (-1 == xioctl (fd, VIDIOC_DQBUF, &buf)) {
        switch (errno) {
            case EAGAIN:
                return 0;
            case EIO:
            default:
                return errnoexit ("VIDIOC_DQBUF");
        }
    }

    assert (buf.index < n_buffers);

    convert2JPEG(buffers[buf.index].start, buf.bytesused);

    if (-1 == xioctl (fd, VIDIOC_QBUF, &buf))
        return errnoexit ("VIDIOC_QBUF");

    return 1;
}

4. Decoding MJPEG

The USB camera I used outputs MJPEG, while the simplewebcam app downloaded from the net only supports YUYV, so the decoding module had to be rewritten.

4.1 JNI layer: inserting the Huffman table

Android's bundled libjpeg library can only decode plain JPEG. MJPEG frames omit the Huffman tables, so we locate SOF0 (Start Of Frame 0) in each frame read from v4l2 and insert a standard Huffman table; after that libjpeg can decode the frame to RGB.

static int convert2JPEG(const void *p, int size)
{
    char *mjpgBuf = NULL;

    if (pImageBuf == NULL) {
        return errnoexit("pImageBuf isn't initialized in JNI");
    }

    /* Clear pImageBuf and realImageSize */
    memset(pImageBuf, 0, (IMG_WIDTH*IMG_HEIGHT)*2);
    realImageSize = 0;

    /* insert dht data to p, and then save them to pImageBuf */
    realImageSize = insert_huffman(p, size, pImageBuf);

    return SUCCESS_LOCAL;
}

static int insert_huffman(const void *in_buf, int buf_size, void *out_buf) 
{
    int pos = 0;
    int size_start = 0;
    /* unsigned char avoids sign-extension in the marker comparison below */
    const unsigned char *pcur = (const unsigned char *)in_buf;
    const unsigned char *pdeb = (const unsigned char *)in_buf;
    const unsigned char *plimit = (const unsigned char *)in_buf + buf_size;
    char *jpeg_buf = (char *)out_buf;

    /* find the SOF0 (Start Of Frame 0, 0xffc0) marker of the JPEG frame;
       check the bound first so pcur[1] never reads past the buffer */
    while ((pcur + 1 < plimit) && (((pcur[0] << 8) | pcur[1]) != 0xffc0)) {
        pcur++;
    }

    LOGD("pcur: %p, plimit: %p", pcur, plimit);

    /* SOF0 of JPEG exists */
    if (pcur + 1 < plimit){
        if (jpeg_buf != NULL)
        {
            /* insert the huffman table right before SOF0 */
            size_start = pcur - pdeb;
            memcpy(jpeg_buf, in_buf, size_start);
            pos += size_start;
            memcpy(jpeg_buf + pos, dht_data, sizeof(dht_data));
            pos += sizeof(dht_data);
            memcpy(jpeg_buf + pos, pcur, buf_size - size_start);
            pos += buf_size - size_start;
            return pos;
        }
    } else{
        LOGE("SOF0 does not exist");
    }
    return 0;
}

const static unsigned char dht_data[] = {
    0xff, 0xc4, 0x01, 0xa2, 0x00, 0x00, 0x01, 0x05, 0x01, 0x01, 0x01, 0x01,
    0x01, 0x01, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x01, 0x02,
    0x03, 0x04, 0x05, 0x06, 0x07, 0x08, 0x09, 0x0a, 0x0b, 0x01, 0x00, 0x03,
    0x01, 0x01, 0x01, 0x01, 0x01, 0x01, 0x01, 0x01, 0x01, 0x00, 0x00, 0x00,
    0x00, 0x00, 0x00, 0x01, 0x02, 0x03, 0x04, 0x05, 0x06, 0x07, 0x08, 0x09,
    0x0a, 0x0b, 0x10, 0x00, 0x02, 0x01, 0x03, 0x03, 0x02, 0x04, 0x03, 0x05,
    0x05, 0x04, 0x04, 0x00, 0x00, 0x01, 0x7d, 0x01, 0x02, 0x03, 0x00, 0x04,
    0x11, 0x05, 0x12, 0x21, 0x31, 0x41, 0x06, 0x13, 0x51, 0x61, 0x07, 0x22,
    0x71, 0x14, 0x32, 0x81, 0x91, 0xa1, 0x08, 0x23, 0x42, 0xb1, 0xc1, 0x15,
    0x52, 0xd1, 0xf0, 0x24, 0x33, 0x62, 0x72, 0x82, 0x09, 0x0a, 0x16, 0x17,
    0x18, 0x19, 0x1a, 0x25, 0x26, 0x27, 0x28, 0x29, 0x2a, 0x34, 0x35, 0x36,
    0x37, 0x38, 0x39, 0x3a, 0x43, 0x44, 0x45, 0x46, 0x47, 0x48, 0x49, 0x4a,
    0x53, 0x54, 0x55, 0x56, 0x57, 0x58, 0x59, 0x5a, 0x63, 0x64, 0x65, 0x66,
    0x67, 0x68, 0x69, 0x6a, 0x73, 0x74, 0x75, 0x76, 0x77, 0x78, 0x79, 0x7a,
    0x83, 0x84, 0x85, 0x86, 0x87, 0x88, 0x89, 0x8a, 0x92, 0x93, 0x94, 0x95,
    0x96, 0x97, 0x98, 0x99, 0x9a, 0xa2, 0xa3, 0xa4, 0xa5, 0xa6, 0xa7, 0xa8,
    0xa9, 0xaa, 0xb2, 0xb3, 0xb4, 0xb5, 0xb6, 0xb7, 0xb8, 0xb9, 0xba, 0xc2,
    0xc3, 0xc4, 0xc5, 0xc6, 0xc7, 0xc8, 0xc9, 0xca, 0xd2, 0xd3, 0xd4, 0xd5,
    0xd6, 0xd7, 0xd8, 0xd9, 0xda, 0xe1, 0xe2, 0xe3, 0xe4, 0xe5, 0xe6, 0xe7,
    0xe8, 0xe9, 0xea, 0xf1, 0xf2, 0xf3, 0xf4, 0xf5, 0xf6, 0xf7, 0xf8, 0xf9,
    0xfa, 0x11, 0x00, 0x02, 0x01, 0x02, 0x04, 0x04, 0x03, 0x04, 0x07, 0x05,
    0x04, 0x04, 0x00, 0x01, 0x02, 0x77, 0x00, 0x01, 0x02, 0x03, 0x11, 0x04,
    0x05, 0x21, 0x31, 0x06, 0x12, 0x41, 0x51, 0x07, 0x61, 0x71, 0x13, 0x22,
    0x32, 0x81, 0x08, 0x14, 0x42, 0x91, 0xa1, 0xb1, 0xc1, 0x09, 0x23, 0x33,
    0x52, 0xf0, 0x15, 0x62, 0x72, 0xd1, 0x0a, 0x16, 0x24, 0x34, 0xe1, 0x25,
    0xf1, 0x17, 0x18, 0x19, 0x1a, 0x26, 0x27, 0x28, 0x29, 0x2a, 0x35, 0x36,
    0x37, 0x38, 0x39, 0x3a, 0x43, 0x44, 0x45, 0x46, 0x47, 0x48, 0x49, 0x4a,
    0x53, 0x54, 0x55, 0x56, 0x57, 0x58, 0x59, 0x5a, 0x63, 0x64, 0x65, 0x66,
    0x67, 0x68, 0x69, 0x6a, 0x73, 0x74, 0x75, 0x76, 0x77, 0x78, 0x79, 0x7a,
    0x82, 0x83, 0x84, 0x85, 0x86, 0x87, 0x88, 0x89, 0x8a, 0x92, 0x93, 0x94,
    0x95, 0x96, 0x97, 0x98, 0x99, 0x9a, 0xa2, 0xa3, 0xa4, 0xa5, 0xa6, 0xa7,
    0xa8, 0xa9, 0xaa, 0xb2, 0xb3, 0xb4, 0xb5, 0xb6, 0xb7, 0xb8, 0xb9, 0xba,
    0xc2, 0xc3, 0xc4, 0xc5, 0xc6, 0xc7, 0xc8, 0xc9, 0xca, 0xd2, 0xd3, 0xd4,
    0xd5, 0xd6, 0xd7, 0xd8, 0xd9, 0xda, 0xe2, 0xe3, 0xe4, 0xe5, 0xe6, 0xe7,
    0xe8, 0xe9, 0xea, 0xf2, 0xf3, 0xf4, 0xf5, 0xf6, 0xf7, 0xf8, 0xf9, 0xfa
};

The while loop scans the frame for the SOF0 marker and leaves pcur pointing at it.
The memcpy sequence then inserts the Huffman table (the dht_data array) just before SOF0; the final, libjpeg-decodable image ends up in pImageBuf.
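The marker scan is easy to get wrong: on platforms where char is signed, 0xff sign-extends and the comparison against 0xffc0 never matches. The same scan, extracted into a standalone, testable helper of my own (find_sof0 is not part of the original code), looks like this:

```c
#include <stddef.h>

/* Return the offset of the first JPEG SOF0 marker (0xff 0xc0) in buf,
 * or -1 if it is absent. Unsigned char keeps the byte compares exact. */
static long find_sof0(const unsigned char *buf, size_t len)
{
    size_t i;
    for (i = 0; i + 1 < len; ++i) {
        if (buf[i] == 0xff && buf[i + 1] == 0xc0)
            return (long)i;
    }
    return -1;
}
```

Exercising it with a synthetic buffer is a quick way to verify the scan before wiring it into the JNI code.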

4.2 Java layer: decode and display

The JNI layer stores the image in pImageBuf, which backs mImageBuffer on the Java side. Once the Java layer has the image, it decodes it with BitmapFactory.decodeByteArray and displays it on a Canvas.

@Override
public void run() {
    while (cameraExists) {

        ......

        imageSize = processCamera();
        if(imageSize == -1 || imageSize == 0)
            continue;

        bmp = BitmapFactory.decodeByteArray(mImageBuffer.array(), mImageBuffer.arrayOffset(), imageSize);
        if(bmp == null)
            continue;

        Canvas canvas = getHolder().lockCanvas();
        if (canvas != null)
        {
            // draw camera bmp on canvas
            canvas.drawBitmap(bmp,null,rect,null);
            getHolder().unlockCanvasAndPost(canvas);
        }
    }
}

5. Summary

Low-level configuration: just enable the OTG function and the UVC-related config options. If a /dev/videoX device node appears after plugging in the device, the USB camera enumerated and initialized successfully.

Application layer: the open-source simplewebcam app was used, but it only supports YUYV, so the decoding module was rewritten. A Huffman table has to be inserted into each data frame by hand before Android's libjpeg library can decode the MJPEG format.

Finally, logs like "uvcvideo: Non-zero status (-71) in video completion handler" appeared during debugging. That is because the MT6735 USB host controller handles isochronous endpoints poorly and frequently drops packets; an MTK-supplied patch is needed to resolve it.