Ways to Capture Camera Video in .NET and How They Compare

As the Windows operating system has evolved, the APIs for capturing video have evolved with it: Microsoft has provided three generations of interfaces, VFW, DirectShow, and Media Foundation. VFW was superseded by DirectShow long ago, and the newest, Media Foundation, is supported on Windows Vista and Windows 7. Unfortunately, these interfaces are COM-based and highly flexible, which makes them inconvenient to use directly from .NET.

.NET wrappers

Generous developers abroad have contributed plenty of open-source projects: DirectShow.net wraps DirectShow, and MediaFoundation.net wraps Media Foundation; both can be found on http://sourceforge.net. These wrappers map almost one-to-one onto the underlying COM interfaces, so they can be used for video capture, but they are still not especially convenient.
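
To give a sense of what the one-to-one wrappers feel like, here is a minimal sketch of enumerating video capture devices with DirectShow.net, assuming a recent DirectShowLib build that exposes DsDevice.GetDevicesOfCat. It only lists devices; actually grabbing frames would still mean assembling a DirectShow filter graph by hand, which is exactly the inconvenience the libraries below hide.

  // Minimal sketch, assuming the DirectShowLib assembly from SourceForge is referenced.
  using System;
  using DirectShowLib;

  class ListVideoDevices
  {
      static void Main()
      {
          // Enumerate video capture devices the same way the COM API exposes them.
          DsDevice[] devices = DsDevice.GetDevicesOfCat(FilterCategory.VideoInputDevice);
          foreach (DsDevice device in devices)
              Console.WriteLine(device.Name);   // print each camera's friendly name
          // Capturing frames would still require building a filter graph
          // (IGraphBuilder, ICaptureGraphBuilder2, ...) by hand.
      }
  }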

After a good deal of searching on Google, I concluded that the following libraries wrap video capture well: DirectX.Capture, OpenCv, EmguCv, and AForge.

  DirectX.Capture

DirectX.Capture is a project published on CodeProject. It makes it very easy to capture video and audio, preview them in a window, and save the result to a file. An example using DirectX.Capture:


  DirectX.Capture
  Capture capture = new Capture(Filters.VideoInputDevices[0],
                                Filters.AudioInputDevices[1]);
  capture.Filename = @"C:\MyVideo.avi";   // verbatim string, so the backslash is not an escape
  capture.Start();
  //...
  capture.Stop();

However, it provides no way to grab an individual frame. If all you need is to preview the video and save it to a file, it works very well.
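
The snippet above does not show the preview; a minimal sketch of wiring it up, assuming the PreviewWindow property from the CodeProject sample and a hypothetical panel1 control on the form:

  Capture capture = new Capture(Filters.VideoInputDevices[0],
                                Filters.AudioInputDevices[0]);
  capture.PreviewWindow = panel1;        // panel1 is a hypothetical Panel used as the preview surface
  capture.Filename = @"C:\MyVideo.avi";
  capture.Start();                       // preview and recording run until Stop() is called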

  OpenCv

OpenCv wraps the video capture parts of VFW and DirectShow very well: it makes it easy to get the contents of an individual frame, and it can also save the result to a video file. An example using OpenCv (called here through the CvInvoke P/Invoke layer):

 


  OpenCv
  IntPtr ptrCapture = CvInvoke.cvCreateCameraCapture(param.deviceInfo.Index);
  while (!stop)
  {
      IntPtr ptrImage = CvInvoke.cvQueryFrame(ptrCapture);
      lock (lockObject)
      {
          stop = stopCapture;
      }
  }
  CvInvoke.cvReleaseCapture(ref ptrCapture);
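
Note that cvQueryFrame only returns a pointer to the native IplImage; to display or process the frame in .NET, it has to be marshaled into a managed image, as the complete source at the end of this article does. Condensed from that source (pbCapture is the PictureBox used there):

  IntPtr ptrImage = CvInvoke.cvQueryFrame(ptrCapture);
  // Read the native IplImage header into a managed struct ...
  MIplImage iplImage = (MIplImage)Marshal.PtrToStructure(ptrImage, typeof(MIplImage));
  // ... and wrap the pixel buffer in an Emgu image so it can be shown as a Bitmap.
  Image<Bgr, byte> image = new Image<Bgr, byte>(iplImage.width, iplImage.height,
                                                iplImage.widthStep, iplImage.imageData);
  pbCapture.Image = image.Bitmap;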

OpenCv does not wrap audio capture, however, so it cannot help if you also need to record audio at the same time.

It is worth noting that OpenCv has wrapped DirectShow since version 1.1, which contradicts the widely repeated claim that OpenCv captures video through VFW and is therefore inefficient. See the appendix of this article for the evidence that OpenCv uses DirectShow.

  EmguCv

EmguCv is a .NET wrapper for OpenCv. It inherits OpenCv's speed while being even easier to use. Example code using EmguCv:


  EmguCv
  Capture capture = new Capture(param.deviceInfo.Index);
  while (!stop)
  {
      pbCapture.Image = capture.QueryFrame().Bitmap;
      lock (lockObject)
      {
          stop = stopCapture;
      }
  }
  capture.Dispose();
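
Part of what makes EmguCv easier is its typed Image<TColor, TDepth> class. A small sketch, assuming the Emgu CV 2.x-era API used throughout this article, where QueryFrame returns an Image<Bgr, byte> (the grayscale conversion is just an arbitrary example of per-frame processing, not something from the program above):

  Capture capture = new Capture(0);                      // first camera
  Image<Bgr, byte> frame = capture.QueryFrame();         // one frame as a typed image
  Image<Gray, byte> gray = frame.Convert<Gray, byte>();  // per-frame processing example
  pbCapture.Image = gray.Bitmap;                         // Bitmap property plugs straight into WinForms
  capture.Dispose();
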
  AForge

AForge is a pure .NET open-source image-processing library. Its video capture classes are also based on DirectShow, but they are easier to use and offer more features; judging by its API and documentation, it feels the closest to Microsoft's own class libraries.


  AForge
  captureAForge = new VideoCaptureDevice(cameraDevice.MonikerString);
  captureAForge.NewFrame += new NewFrameEventHandler(captureAForge_NewFrame);
  captureAForge.Start();
  //...
  captureAForge.SignalToStop();

  private void captureAForge_NewFrame(object sender, NewFrameEventArgs eventArgs)
  {
      pbCapture.Image = (Bitmap)eventArgs.Frame.Clone();
  }
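
Before Start is called, the device can also be told which resolution and frame rate to use; the complete source at the end of the article reads these values from the UI, and in condensed form it looks like this (640x480 at 30 fps are just example values):

  captureAForge = new VideoCaptureDevice(cameraDevice.MonikerString);
  captureAForge.DesiredFrameSize = new Size(640, 480);   // example values; the full source
  captureAForge.DesiredFrameRate = 30;                   // reads them from the form's controls
  captureAForge.NewFrame += new NewFrameEventHandler(captureAForge_NewFrame);
  captureAForge.Start();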


Comparison

Now that they have been introduced, let us compare them. They are all built on DirectShow, so their performance is nearly identical. In fact, I suspect the camera's hardware and driver support matter more for performance: my camera has no dedicated driver under Windows 7 and has to use the default driver Microsoft provides, so it performs noticeably worse there than under Windows XP.

The main points worth noting:

(1) Only DirectX.Capture captures audio;

(2) Only DirectX.Capture cannot grab individual frames;

(3) EmguCv's free edition carries restrictive licensing terms (a paid commercial license is sold separately), while the other libraries' licenses are all very permissive;

(4) AForge has better samples and documentation, and offers more features.

Appendix: OpenCv also uses DirectShow to capture video


By analyzing the OpenCv 2.0 source code, I came to the conclusion that OpenCv uses DirectShow to capture video. The evidence:


  DirectShow In OpenCv
  (1)
  // _highgui.h line:100
  #if (_MSC_VER >= 1400 || defined __GNUC__) && !defined WIN64 && !defined _WIN64
  #define HAVE_VIDEOINPUT 1
  #endif

  (2)
  // cvcap_dshow.cpp line:44
  #ifdef HAVE_VIDEOINPUT
  #include "videoinput.h"
  /********************* Capturing video from camera via VFW *********************/
  class CvCaptureCAM_DShow : public CvCapture

  (3)
  // cvcap.cpp line:102
  CV_IMPL CvCapture* cvCreateCameraCapture (int index)
  {
      //.....
      //line:140
      switch (domains[i])
      {
  #ifdef HAVE_VIDEOINPUT
      case CV_CAP_DSHOW:
          capture = cvCreateCameraCapture_DShow (index);
          if (capture)
              return capture;
          break;
  #endif

Complete source code for this article


 using System;
  using System.Collections.Generic;
  using System.ComponentModel;
  using System.Data;
  using System.Drawing;
  using System.Linq;
  using System.Text;
  using System.Windows.Forms;
  using System.Diagnostics;
  using System.Runtime.InteropServices;
  using AForge.Video;
  using AForge.Video.DirectShow;
  using Emgu.CV;
  using Emgu.CV.CvEnum;
  using Emgu.CV.Structure;
  using Emgu.CV.UI;
  using System.Threading;
  namespace ImageProcessLearn
  {
  public partial class FormCameraCapture : Form
  {
  private int framesCaptured; // number of frames captured so far
  private int frameCount;   // total number of frames to capture
  private Stopwatch sw;    // stopwatch used for timing
  private VideoCaptureDevice captureAForge = null;  // AForge video capture object
  private bool stopCapture;              // whether video capture should stop
  private object lockObject = new object();
  public FormCameraCapture()
  {
  InitializeComponent();
  sw = new Stopwatch();
  }
  //On form load, get the list of video capture devices
  private void FormCameraCapture_Load(object sender, EventArgs e)
  {
  FilterInfoCollection videoDevices = new FilterInfoCollection(FilterCategory.VideoInputDevice);
  if (videoDevices != null && videoDevices.Count > 0)
  {
  int idx = 0;
  foreach (FilterInfo device in videoDevices)
  {
  cmbCaptureDevice.Items.Add(new DeviceInfo(device.Name, device.MonikerString, idx, FilterCategory.VideoInputDevice));
  idx++;
  }
  cmbCaptureDevice.SelectedIndex = 0;
  }
  }
  //When the selected video device changes, repopulate that device's capabilities
  private void cmbCaptureDevice_SelectedIndexChanged(object sender, EventArgs e)
  {
  if (cmbCaptureDevice.SelectedItem != null)
  {
  //Remember the previously selected device capability
  Size oldFrameSize = new Size(0, 0);
  int oldMaxFrameRate = 0;
  if (cmbDeviceCapability.SelectedItem != null)
  {
  oldFrameSize = ((DeviceCapabilityInfo)cmbDeviceCapability.SelectedItem).FrameSize;
  oldMaxFrameRate = ((DeviceCapabilityInfo)cmbDeviceCapability.SelectedItem).MaxFrameRate;
  }
  //Clear the capability list
  cmbDeviceCapability.Items.Clear();
  //Add the new device's capabilities
  int oldCapIndex = -1;  //new index of the previously selected capability
  VideoCaptureDevice video = new VideoCaptureDevice(((DeviceInfo)cmbCaptureDevice.SelectedItem).MonikerString);
  for (int i = 0; i < video.VideoCapabilities.Length; i++)
  {
  VideoCapabilities cap = video.VideoCapabilities[i];
  DeviceCapabilityInfo capInfo = new DeviceCapabilityInfo(cap.FrameSize, cap.MaxFrameRate);
  cmbDeviceCapability.Items.Add(capInfo);
  if (oldFrameSize == capInfo.FrameSize && oldMaxFrameRate == capInfo.MaxFrameRate)
  oldCapIndex = i;
  }
  //Re-select the previous capability, or pick a new one
  if (oldCapIndex == -1)
  oldCapIndex = 0;
  cmbDeviceCapability.SelectedIndex = oldCapIndex;
  }
  }
  //When the selected device capability changes
  private void cmbDeviceCapability_SelectedIndexChanged(object sender, EventArgs e)
  {
  if (int.Parse(txtRate.Text) >= ((DeviceCapabilityInfo)cmbDeviceCapability.SelectedItem).MaxFrameRate)
  txtRate.Text = ((DeviceCapabilityInfo)cmbDeviceCapability.SelectedItem).MaxFrameRate.ToString();
  }
  //Performance test: measure how long it takes to grab the specified number of frames and convert them to images, then compute the FPS
  private void btnPerformTest_Click(object sender, EventArgs e)
  {
  int frameCount = int.Parse(txtFrameCount.Text);
  if (frameCount <= 0)
  frameCount = 300;
  DeviceInfo device = (DeviceInfo)cmbCaptureDevice.SelectedItem;
  btnPerformTest.Enabled = false;
  btnStart.Enabled = false;
  txtResult.Text += PerformTestWithAForge(device.MonikerString, frameCount);
  txtResult.Text += PerformTestWithEmguCv(device.Index, frameCount);
  txtResult.Text += PerformTestWithOpenCv(device.Index, frameCount);
  btnPerformTest.Enabled = true;
  btnStart.Enabled = true;
  }



  //AForge performance test
  private string PerformTestWithAForge(string deviceMonikerString, int frameCount)
  {
  VideoCaptureDevice video = new VideoCaptureDevice(deviceMonikerString);
  video.NewFrame += new NewFrameEventHandler(PerformTest_NewFrame);
  video.DesiredFrameSize = ((DeviceCapabilityInfo)cmbDeviceCapability.SelectedItem).FrameSize;
  video.DesiredFrameRate = int.Parse(txtRate.Text);
  framesCaptured = 0;
  this.frameCount = frameCount;
  video.Start();
  sw.Reset();
  sw.Start();
  video.WaitForStop();
  double time = sw.Elapsed.TotalMilliseconds;
  return string.Format("AForge performance test - frames: {0}, elapsed: {1:F05} ms, FPS: {2:F02}, settings ({3})\r\n", frameCount, time, 1000d * frameCount / time, GetSettings());
  }
  void PerformTest_NewFrame(object sender, NewFrameEventArgs eventArgs)
  {
  framesCaptured++;
  if (framesCaptured > frameCount)
  {
  sw.Stop();
  VideoCaptureDevice video = sender as VideoCaptureDevice;
  video.SignalToStop();
  }
  }
  //EmguCv performance test
  private string PerformTestWithEmguCv(int deviceIndex, int frameCount)
  {
  Capture video = new Capture(deviceIndex);
  video.SetCaptureProperty(CAP_PROP.CV_CAP_PROP_FRAME_WIDTH, ((DeviceCapabilityInfo)cmbDeviceCapability.SelectedItem).FrameSize.Width);
  video.SetCaptureProperty(CAP_PROP.CV_CAP_PROP_FRAME_HEIGHT, ((DeviceCapabilityInfo)cmbDeviceCapability.SelectedItem).FrameSize.Height);
  video.SetCaptureProperty(CAP_PROP.CV_CAP_PROP_FPS, double.Parse(txtRate.Text));
  sw.Reset();
  sw.Start();
  for (int i = 0; i < frameCount; i++)
  video.QueryFrame();
  sw.Stop();
  video.Dispose();
  double time = sw.Elapsed.TotalMilliseconds;
  return string.Format("EmguCv performance test - frames: {0}, elapsed: {1:F05} ms, FPS: {2:F02}, settings ({3})\r\n", frameCount, time, 1000d * frameCount / time, GetSettings());
  }
  //OpenCv performance test
  private string PerformTestWithOpenCv(int deviceIndex, int frameCount)
  {
  IntPtr ptrVideo = CvInvoke.cvCreateCameraCapture(deviceIndex);
  CvInvoke.cvSetCaptureProperty(ptrVideo, CAP_PROP.CV_CAP_PROP_FRAME_WIDTH, ((DeviceCapabilityInfo)cmbDeviceCapability.SelectedItem).FrameSize.Width);
  CvInvoke.cvSetCaptureProperty(ptrVideo, CAP_PROP.CV_CAP_PROP_FRAME_HEIGHT, ((DeviceCapabilityInfo)cmbDeviceCapability.SelectedItem).FrameSize.Height);
  CvInvoke.cvSetCaptureProperty(ptrVideo, CAP_PROP.CV_CAP_PROP_FPS, double.Parse(txtRate.Text));
  sw.Reset();
  sw.Start();
  for (int i = 0; i < frameCount; i++)
  CvInvoke.cvQueryFrame(ptrVideo);
  sw.Stop();
  CvInvoke.cvReleaseCapture(ref ptrVideo);
  double time = sw.Elapsed.TotalMilliseconds;
  return string.Format("OpenCv performance test - frames: {0}, elapsed: {1:F05} ms, FPS: {2:F02}, settings ({3})\r\n", frameCount, time, 1000d * frameCount / time, GetSettings());
  }
  //Build a string describing the current settings
  private string GetSettings()
  {
  return string.Format("camera: {0}, size: {1}x{2}, FPS: {3}", ((DeviceInfo)cmbCaptureDevice.SelectedItem).Name,
  ((DeviceCapabilityInfo)cmbDeviceCapability.SelectedItem).FrameSize.Width,
  ((DeviceCapabilityInfo)cmbDeviceCapability.SelectedItem).FrameSize.Height,
  txtRate.Text);
  }
  //Start capturing video
  private void btnStart_Click(object sender, EventArgs e)
  {
  //Read the settings from the UI
  DeviceInfo cameraDevice = (DeviceInfo)cmbCaptureDevice.SelectedItem;
  Size frameSize = ((DeviceCapabilityInfo)cmbDeviceCapability.SelectedItem).FrameSize;
  int rate = int.Parse(txtRate.Text);
  ThreadParam param = new ThreadParam(cameraDevice, new DeviceCapabilityInfo(frameSize, rate));
  if (rbAForge.Checked)
  {
  captureAForge = new VideoCaptureDevice(cameraDevice.MonikerString);
  captureAForge.DesiredFrameSize = frameSize;
  captureAForge.DesiredFrameRate = rate;
  captureAForge.NewFrame += new NewFrameEventHandler(captureAForge_NewFrame);
  txtResult.Text += string.Format("Capture started (method: AForge, start time: {0})......\r\n", DateTime.Now.ToLongTimeString());
  framesCaptured = 0;
  sw.Reset();
  sw.Start();
  captureAForge.Start();
  }
  else if (rbEmguCv.Checked)
  {
  stopCapture = false;
  Thread captureThread = new Thread(new ParameterizedThreadStart(CaptureWithEmguCv));
  captureThread.Start(param);
  }
  else if (rbOpenCv.Checked)
  {
  stopCapture = false;
  Thread captureThread = new Thread(new ParameterizedThreadStart(CaptureWithOpenCv));
  captureThread.Start(param);
  }
  btnStart.Enabled = false;
  btnStop.Enabled = true;
  btnPerformTest.Enabled = false;
  }
  private void captureAForge_NewFrame(object sender, NewFrameEventArgs eventArgs)
  {
  pbCapture.Image = (Bitmap)eventArgs.Frame.Clone();
  lock (lockObject)
  {
  framesCaptured++;
  }
  }



  //EmguCv video capture
  private void CaptureWithEmguCv(object objParam)
  {
  bool stop = false;
  int framesCaptured = 0;
  Stopwatch sw = new Stopwatch();
  txtResult.Invoke(new AddResultDelegate(AddResultMethod), string.Format("Capture started (method: EmguCv, start time: {0})......\r\n", DateTime.Now.ToLongTimeString()));
  ThreadParam param = (ThreadParam)objParam;
  Capture capture = new Capture(param.deviceInfo.Index);
  capture.SetCaptureProperty(CAP_PROP.CV_CAP_PROP_FRAME_WIDTH, param.deviceCapability.FrameSize.Width);
  capture.SetCaptureProperty(CAP_PROP.CV_CAP_PROP_FRAME_HEIGHT, param.deviceCapability.FrameSize.Height);
  capture.SetCaptureProperty(CAP_PROP.CV_CAP_PROP_FPS, param.deviceCapability.MaxFrameRate);
  sw.Start();
  while (!stop)
  {
  pbCapture.Image = capture.QueryFrame().Bitmap;
  framesCaptured++;
  lock (lockObject)
  {
  stop = stopCapture;
  }
  }
  sw.Stop();
  txtResult.Invoke(new AddResultDelegate(AddResultMethod), string.Format("Capture finished (method: EmguCv, end time: {0}, elapsed: {1:F05} ms, frames: {2}, FPS: {3:F02})\r\n",
  DateTime.Now.ToLongTimeString(), sw.Elapsed.TotalMilliseconds, framesCaptured, framesCaptured / sw.Elapsed.TotalSeconds));
  capture.Dispose();
  }
  //OpenCv video capture
  private void CaptureWithOpenCv(object objParam)
  {
  bool stop = false;
  int framesCaptured = 0;
  Stopwatch sw = new Stopwatch();
  txtResult.Invoke(new AddResultDelegate(AddResultMethod), string.Format("Capture started (method: OpenCv, start time: {0})......\r\n", DateTime.Now.ToLongTimeString()));
  ThreadParam param = (ThreadParam)objParam;
  IntPtr ptrCapture = CvInvoke.cvCreateCameraCapture(param.deviceInfo.Index);
  CvInvoke.cvSetCaptureProperty(ptrCapture, CAP_PROP.CV_CAP_PROP_FRAME_WIDTH, param.deviceCapability.FrameSize.Width);
  CvInvoke.cvSetCaptureProperty(ptrCapture, CAP_PROP.CV_CAP_PROP_FRAME_HEIGHT, param.deviceCapability.FrameSize.Height);
  CvInvoke.cvSetCaptureProperty(ptrCapture, CAP_PROP.CV_CAP_PROP_FPS, param.deviceCapability.MaxFrameRate);
  sw.Start();
  while (!stop)
  {
  IntPtr ptrImage = CvInvoke.cvQueryFrame(ptrCapture);
  MIplImage iplImage = (MIplImage)Marshal.PtrToStructure(ptrImage, typeof(MIplImage));
  Image<Bgr, byte> image = new Image<Bgr, byte>(iplImage.width, iplImage.height, iplImage.widthStep, iplImage.imageData);
  pbCapture.Image = image.Bitmap;
  //pbCapture.Image = ImageConverter.IplImagePointerToBitmap(ptrImage);
  framesCaptured++;
  lock (lockObject)
  {
  stop = stopCapture;
  }
  }


sw.Stop();
  txtResult.Invoke(new AddResultDelegate(AddResultMethod), string.Format("Capture finished (method: OpenCv, end time: {0}, elapsed: {1:F05} ms, frames: {2}, FPS: {3:F02})\r\n",
  DateTime.Now.ToLongTimeString(), sw.Elapsed.TotalMilliseconds, framesCaptured, framesCaptured / sw.Elapsed.TotalSeconds));
  CvInvoke.cvReleaseCapture(ref ptrCapture);
  }
  //Stop capturing video
  private void btnStop_Click(object sender, EventArgs e)
  {
  if (captureAForge != null)
  {
  sw.Stop();
  if (captureAForge.IsRunning)
  captureAForge.SignalToStop();
  captureAForge = null;
  txtResult.Text += string.Format("Capture finished (method: AForge, end time: {0}, elapsed: {1:F05} ms, frames: {2}, FPS: {3:F02})\r\n",
  DateTime.Now.ToLongTimeString(), sw.Elapsed.TotalMilliseconds, framesCaptured, framesCaptured / sw.Elapsed.TotalSeconds);
  }
  lock (lockObject)
  {
  stopCapture = true;
  }
  btnStart.Enabled = true;
  btnStop.Enabled = false;
  btnPerformTest.Enabled = true;
  }
  //Delegate and method used to append results from worker threads
  public delegate void AddResultDelegate(string result);
  public void AddResultMethod(string result)
  {
  txtResult.Text += result;
  }
  }
  //Device information
  public struct DeviceInfo
  {
  public string Name;
  public string MonikerString;
  public int Index;
  Guid Category;
  public DeviceInfo(string name, string monikerString, int index) :
  this(name, monikerString, index, Guid.Empty)
  {
  }
  public DeviceInfo(string name, string monikerString, int index, Guid category)
  {
  Name = name;
  MonikerString = monikerString;
  Index = index;
  Category = category;
  }
  public override string ToString()
  {
  return Name;
  }
  }
  //Device capability
  public struct DeviceCapabilityInfo
  {
  public Size FrameSize;
  public int MaxFrameRate;
  public DeviceCapabilityInfo(Size frameSize, int maxFrameRate)
  {
  FrameSize = frameSize;
  MaxFrameRate = maxFrameRate;
  }
  public override string ToString()
  {
  return string.Format("{0}x{1} {2}fps", FrameSize.Width, FrameSize.Height, MaxFrameRate);
  }
  }
  //Parameters passed to the capture worker thread
  public struct ThreadParam
  {
  public DeviceInfo deviceInfo;
  public DeviceCapabilityInfo deviceCapability;
  public ThreadParam(DeviceInfo deviceInfo, DeviceCapabilityInfo deviceCapability)
  {
  this.deviceInfo = deviceInfo;
  this.deviceCapability = deviceCapability;
  }
  }
  }


Source: blog.csdn.net/liangsongjun/article/details/7877390