Monday, October 26, 2009

Surveillance recorder standards

A draft document on the proposed surveillance recorder standards, which aims to establish interoperability among different surveillance recorder manufacturers, has already been released.  The group spearheading this effort is called the Physical Security Interoperability Alliance (PSIA) and is composed of major industry players such as GE, Pelco, Cisco, Texas Instruments, etc.

By setting a standard, the group aims to reduce the problems of managing various DVR and NVR brands in one system, a typical headache for large organizations that need to maintain several backend systems to control and manage varying subsystems.

You can read more about the document at:

http://www.psialliance.org/documents/1013RaCMRel.pdf

Internet Eyes video monitoring service


There is a new video monitoring company in the UK called Internet Eyes that aims to provide video and alarm monitoring services for companies, reportedly using hundreds of "freelance" viewers motivated by a GBP 1,200 prize.  It looks very promising, but I think there are issues that need to be addressed, especially around reliability and accountability.  Also, a single monthly prize may not be a better motivator than smaller but more frequent rewards.  Let's see how it goes.  Their site goes live this November.



An invisibility cloak?

An interesting video on possible technologies that will render someone invisible.



Friday, October 23, 2009

Spoofing IP-based CCTV systems

The case against the security of IP-based CCTV systems was recently highlighted with the release of a new tool shown at the Defcon hacker conference in Las Vegas.  The security assessment tool, VideoJak, can both intercept IP video feeds and inject false 'looped' images.



The concept is similar to Hollywood's "Mission: Impossible", where the attacker hacks, monitors and records a CCTV feed (while nothing is happening), then plays it back in a loop to mask the actual live feed (presumably while Tom Cruise is lowered down on a black rope).

This tool also provides the ability to intercept video conferencing.

To be on the safe side, if you are using any kind of conferencing facility, you should assume that the information can be intercepted. With the popularity of IP-streamed video systems, we can never be sure whether our networks have been compromised.


Thursday, October 22, 2009

It's Friday!




It's beer Friday everyone!  Good day!

CUDA

A major issue with network video recorders (NVRs) is high CPU utilization.  Unlike DVRs, where encoding and decoding are done in hardware, an NVR must receive and record multiple video streams while encoding and decoding them in software for live view and remote view, which demands a lot of processing power.  Thirty-two incoming D1 (720x480) streams, with no dedicated hardware encoder/decoder chip, will heavily tax even the latest PC CPUs on the market today.  In fact, no present NVR solution can display more than four simultaneous D1 streams while recording them at the same time.
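To get a feel for the numbers, here is a rough back-of-the-envelope sketch in C.  The figures are assumptions for illustration only: 32 channels of D1 (720x480) at 30 frames per second and 3 bytes per decoded pixel.

    #include <stdio.h>

    int main(void)
    {
        /* Assumed figures: 32 channels, D1 = 720x480, 30 fps, 3 bytes per decoded pixel */
        const double width = 720, height = 480;
        const double fps = 30, channels = 32, bytes_per_pixel = 3;

        double pixels_per_sec = width * height * fps * channels;
        double bytes_per_sec  = pixels_per_sec * bytes_per_pixel;

        printf("Decoded pixel rate: %.0f megapixels/s\n", pixels_per_sec / 1e6);
        printf("Raw frame data:     %.0f MB/s\n", bytes_per_sec / (1024 * 1024));
        return 0;
    }

That works out to roughly 330 megapixels (around 950 MB of raw frames) to be decoded, displayed and re-encoded every second in software, which is why a software-only NVR saturates the CPU so quickly.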


CUDA to the rescue

Short of installing another encoder/decoder card, one can use the spare processing power of the PC's GPU (graphics processing unit) for encoding and decoding, as well as to add more capabilities to the system.   Last year, NVIDIA released the CUDA developer kit to enable applications to take advantage of its powerful GPUs.

What is CUDA?

CUDA (an acronym for Compute Unified Device Architecture) is a parallel computing architecture developed by NVIDIA. CUDA is the computing engine in NVIDIA graphics processing units (GPUs) and is accessible to software developers through industry-standard programming languages. Programmers use 'C for CUDA' (C with NVIDIA extensions), compiled through a PathScale Open64 C compiler, to code algorithms for execution on the GPU. The CUDA architecture supports a range of computational interfaces, including OpenCL and DirectCompute. Third-party wrappers are also available for Python, Fortran, Java and MATLAB.

The latest drivers all contain the necessary CUDA components. CUDA works with all NVIDIA GPUs from the G8X series onwards, including the GeForce, Quadro and Tesla lines. NVIDIA states that programs developed for the GeForce 8 series will also work without modification on all future NVIDIA video cards, thanks to binary compatibility. CUDA gives developers access to the native instruction set and memory of the parallel computational elements in CUDA GPUs. Using CUDA, the latest NVIDIA GPUs effectively become open architectures like CPUs. Unlike CPUs, however, GPUs have a parallel "many-core" architecture, each core capable of running thousands of threads simultaneously; if an application is suited to this kind of architecture, the GPU can offer large performance benefits.

In the computer gaming industry, in addition to graphics rendering, graphics cards are used for game physics calculations (physical effects like debris, smoke, fire and fluids); examples include PhysX and Bullet. CUDA has also been used to accelerate non-graphical applications in computational biology, cryptography and other fields by an order of magnitude or more. An example of this is the BOINC distributed computing client.


CUDA provides both a low-level API and a higher-level API. The initial CUDA SDK was made public on February 15, 2007, for Microsoft Windows and Linux. Mac OS X support was added later in version 2.0, which superseded the beta released on February 14, 2008.

GPUs are cheap, massively parallel, programmable compute devices that can be used for many general purpose (non-graphics) tasks. They are a "good fit" for many scientific applications and significant speedups (as compared to contemporary CPUs) have been reported. The CUDA language makes NVIDIA GPUs accessible to developers through a series of extensions to C (with no mention of pixels or shading!).
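To make that concrete, below is a minimal, hypothetical 'C for CUDA' sketch, not taken from any particular NVR product: a kernel that converts one D1 frame from packed BGR to grayscale, the kind of per-pixel work an NVR could push onto the GPU.  The function and variable names are my own.

    #include <cuda_runtime.h>

    /* Convert a packed BGR frame to grayscale, one thread per pixel. */
    __global__ void bgr_to_gray(const unsigned char *bgr, unsigned char *gray,
                                int width, int height)
    {
        int x = blockIdx.x * blockDim.x + threadIdx.x;
        int y = blockIdx.y * blockDim.y + threadIdx.y;
        if (x >= width || y >= height)
            return;

        int i = y * width + x;
        const unsigned char *p = &bgr[i * 3];   /* B, G, R */
        /* Integer approximation of the Rec. 601 luma weights */
        gray[i] = (unsigned char)((29 * p[0] + 150 * p[1] + 77 * p[2]) >> 8);
    }

    int main(void)
    {
        const int w = 720, h = 480;             /* one D1 frame */
        unsigned char *d_bgr, *d_gray;
        cudaMalloc((void **)&d_bgr,  w * h * 3);
        cudaMalloc((void **)&d_gray, w * h);
        /* In a real NVR the decoded frame would be copied in with cudaMemcpy here. */

        dim3 block(16, 16);
        dim3 grid((w + block.x - 1) / block.x, (h + block.y - 1) / block.y);
        bgr_to_gray<<<grid, block>>>(d_bgr, d_gray, w, h);
        cudaThreadSynchronize();                /* wait for the kernel to finish */

        cudaFree(d_bgr);
        cudaFree(d_gray);
        return 0;
    }

Each pixel gets its own thread, so a single D1 frame launches over 300,000 threads, exactly the kind of "many-core" parallelism described above.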

By harnessing CUDA, it may be possible to record high-quality video while simultaneously decoding multiple high-quality streams for playback and remote viewing.  It also offloads work from the CPU, which helps the application run faster and more smoothly.  Expect to see a lot of CUDA-based NVRs in the future.

Some links:

http://codingplayground.blogspot.com/2009/02/web-search-ranking-and-gpp-many-core.html
http://www.bikal.co.uk/network-video-recorder/nvr-pro.html

Monday, October 19, 2009

ARMed and ready!




We are right in the midst of an ongoing battle which I am sure most haven't noticed.  You may have heard about the Intel vs. AMD wars of the '90s, but have you heard about ARM processors? ARM is the processor of choice for all our embedded devices, probably for your phone as well, and arguably has a larger impact on the tech industry as a whole.   Why choose ARM?  The question is best answered by learning more about the chip.

From Wikipedia:

"The ARM is a 32-bit reduced instruction set computer (RISC) instruction set architecture (ISA) developed by ARM Limited. It was known as the Advanced RISC Machine, and before that as the Acorn RISC Machine. The ARM architecture is the most widely used 32-bit ISA in terms of numbers produced. They were originally conceived as a processor for desktop personal computers by Acorn Computers, a market now dominated by the x86 family used byIBM PC compatible computers. But the relative simplicity of ARM processors made them suitable for low power applications. This has made them dominant in the mobile and embedded electronics market as relatively low cost and small microprocessors and microcontrollers."


When someone asks who the giants of the semiconductor industry are, the names that float to the top of my mind are Intel's founders, Gordon Moore and Robert Noyce. And of course there is the colorful and controversial William Shockley, who founded Shockley Semiconductor Laboratory, the first major semiconductor lab in Silicon Valley. And then there is Jack Kilby of Texas Instruments, who, along with Robert Noyce, is credited with inventing the integrated circuit.  The person behind ARM rarely comes up: Dr. Robin Saxby; or to be more specific, Sir Robin Saxby, founding CEO of Advanced RISC Machines Ltd (ARM) of Cambridge, England.

How big is the ARM market?

"As of 2007, about 98 percent of the more than a billion mobile phones sold each year use at least one ARM processor. As of 2009, ARM processors account for approximately 90% of all embedded 32-bit RISC processors. ARM processors are used extensively in consumer electronics, including PDAs, mobile phones, iPods and other digital media and music players, hand-held game consoles, calculators and computer peripherals such as hard drives and routers."

"Consider: Intel sold its 1 billionth x86 chip in 2003. Its closest rival, AMD, broke the 500 million mark just this year. ARM, on the other hand, expects to ship 2.8 billion processors in 2009 alone -- or around 90 chips per second. That's in addition to the more than 10 billion ARM processors already powering devices today.


Pick up any mobile phone and there's a 95 percent chance it contains at least one ARM processor. If the phone was manufactured in the past five years, make that 100 percent; that goes for standard handsets as well as smartphones.
The same is true for portable media players. Whether the label says Archos, iRiver, or Sony, inside it's ARM."


Thus, ARM is a giant in the tech arena, and what is most surprising is that folks in the U.S. haven't heard of the almost ubiquitous ARM processor. Almost ubiquitous because the ARM processor, in some form or another, is at the core of almost every cell phone or smartphone sold today, as well as billions of embedded devices around the world.  Until now these processors have mostly lived in embedded devices, cell phones and, most recently, smartphones. They are the main processors running many of the Linux-based systems on the market today, and you'll find them in cell phones and smartphones such as Apple's iPhone, and in DVD players, GPS systems, smart TVs, set-top boxes, etc.

However, we are starting to see ARM processors move into some netbook models. The one with the highest profile has been the ARM-based Qualcomm Snapdragon processor, which will reportedly be used in an HP netbook somewhere down the line. An ARM core is also at the heart of NVIDIA's Tegra processor, which, like the Snapdragon, is heavily targeted at netbooks, especially netbooks aimed at being sold through telecom providers like Verizon Wireless, AT&T, etc. They are even showing up in servers with low-voltage requirements.

ARM Holdings Ltd, the parent company, does a lot of R&D of its own, but its real business is IP licensing. The ARM core is licensed to six public ARM silicon partners that are specifically taking aim at netbooks and smartbooks: Marvell, Qualcomm, Freescale, Samsung, NVIDIA and Texas Instruments. In emerging markets, Linux-based systems could do well, and over time, Linux-based systems built on an OS like Android could actually gain ground with carriers that want to sell subsidized netbooks in stores and want an OS that offers more control over the user's experience and service offerings.

This is why we chose to use ARM in our embedded devices.  Lock and load!

References:

www.wikipedia.org
www.pcmag.com
http://www.computerworld.com/s/article/9140039/ARM_vs._Atom_The_battle_for_the_next_digital_frontier?taxonomyId=15

Friday, October 16, 2009

Stereoscopic rangefinder



One of the projects I had been mulling over is a stereo rangefinder using two PTZ cameras.  The concept is simple: when an object is viewed by two cameras, its distance can easily be calculated.  Since the distance between the two PTZ cameras is known, the distance to the object can be derived trigonometrically.  This concept is still used in optical rangefinders today, despite the availability of accurate laser rangefinders.

During World War II, Allied and German tanks slaved their optical rangefinders to the tank's main gun.  To operate these rangefinders, the gunner needed to merge the two views into a single image, and the gun was automatically set to fire at the determined range.  To illustrate:



sin B = b / c
c = b / sin B

c^2 = a^2 + b^2
a = sqrt(c^2 - b^2)
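As a quick numerical check of the triangle above, here is a small C sketch.  The 1.5-meter baseline and 2-degree angle are made-up values, purely for illustration: b is the known baseline between the two cameras, B is the measured angle, c is the line of sight to the target and a is the range.

    #include <math.h>
    #include <stdio.h>

    int main(void)
    {
        double b = 1.5;                  /* baseline between the cameras, meters (assumed) */
        double B = 2.0 * M_PI / 180.0;   /* measured angle of 2 degrees, in radians (assumed) */

        double c = b / sin(B);           /* c = b / sin B       */
        double a = sqrt(c*c - b*b);      /* a = sqrt(c^2 - b^2) */

        printf("line of sight c = %.1f m, range a = %.1f m\n", c, a);
        return 0;
    }

The longer the baseline b and the more precisely the angle B is measured, the better the range estimate, which is why dedicated optical rangefinders use widely separated lenses.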

A reference to such a system is given below:

"Using stereoscopic images to determine the range to objects isn't a new phenomena. Before the age of the laser manual stereo correspondence was used on ships and also amusingly on battle tanks. This web page describes the use of a stereo range finder in an M48 Patton tank as follows:
When first deployed, the M48-A3 had for the 1960s a state-of-the-art fire control system. At the time computers were mechanical, and range to the target was provided by a stereoscopic range finder, which functioned similarly to a 35mm camera. An end-box on each side of the turret exterior held a prism-type mirror. Turning a hand-crank on the range finder would pivot these mirrors until the double-image in the range finder merged. As the distance between the mirrors is exactly known, a little trigonometry provided the range (in meters) to the target. This information was displayed on a range indicator, and also fed to the ballistic computer by a rotating shaft. The ballistic computer was a collection of gears and cams--nothing was solid-state--which had a handle so that the gunner could select the type of ammunition that was to be fired. Each round had a different muzzle velocity, and therefore the computer had a different cam for each type. The computer would take the range data, merge it with the velocity data, and via a set of rotating shafts, supply this information to the gun's super-elevation mechanism, resulting in the gun being elevated above the gunners line of sight sufficiently for the round to overcome the downward pull of gravity on its way to the target.
The stereo range finder was effectively a dual lateral periscope, with the sights protruding from either side of the metallic monster as shown below. Using the stereo ranger was referred to as "flying the geese" due to the appearance of the markings which needed to be optically aligned."


With the advent of computers and video intelligence, the process of aligning the images and deriving the range to a given object can be automated with ease.  Practical applications for such a system are, however, limited.

Wednesday, October 14, 2009

DVR standards





Market demand for the best, fastest and highest-quality products pushes us to continuously develop new and better products.  From the old analog tape-based video recorders, the industry standard has moved to digital video recorders storing compressed TV-quality video (CIF) at 30 frames per second.  Compression standards have likewise improved over the years, starting with Motion JPEG, then MPEG-4 and now H.264 (also known as MPEG-4 Part 10).    Positioning one's product above this standard, at the lowest possible price, is the sweet spot most manufacturers aim for.  Some products sell on low price alone and totally forget about quality and performance, while others offer the best performance and quality at prohibitive prices.  What is the sweet spot in positioning?  What is the minimum standard?  What is the best price?

In my opinion, a decent DVR should have the following features:

  • Real-time recording (30 frames per second - Note: Movies run at 24 frames per second) on all cameras for at least CIF size
  • MPEG-4 compression
  • Recording capability of up to D1 size (better known as DVD quality)
  • Network capability (this includes video streaming through a LAN or the Internet, remote setup and remote viewing)
  • At least triplex operation (simultaneous recording, search/playback and live viewing)
Some ultra-cheap designs are available on the market but lack some of these basic features, which goes to show that there is no such thing as a "free lunch :)".  What are the possible downsides of missing some of the basic features?


  • Missing important frames due to frame-rate limitations.  Imagine a split-second action that results in damaged property, missing goods or, worse, lost lives.  You don't want your video evidence to leave whodunit questions unanswered.

  • Why compression?  Yes, storage is cheap, but that is no reason to waste money on extra storage.   Uncompressed video takes a huge amount of storage space, regardless of whether you store it on a hard drive, DVD or digital tape. To imagine how much space is required, consider that a typical uncompressed still frame of video, at the quality most of us are used to viewing, requires just under one megabyte to store. Video typically plays at 30 frames per second, so typical uncompressed video occupies about 27 megabytes per second. Do a little more math, and you'll soon discover that the new 80-gigabyte hard drive that came with your computer will only store about 50 minutes of raw, uncompressed video. Do one more calculation and you'll see that a DVD (at 4.5 GB) can hold less than three minutes (a worked version of this arithmetic appears after this list). Clearly, we need some form of digital compression to reduce that file size.  The second issue is so closely related to the first that it's really the same problem viewed from another angle. Imagine you have an uncompressed VHS-quality video file sitting on a hard drive, ready to play. To provide smooth playback, your hard drive would have to feed data to your computer at a sustained 27 megabytes per second (or, as an engineer would think of it, 216 megabits per second [27 x 8 bits/byte]). Storage systems that can hit these speeds exist, but they are very expensive. Now consider that you want to deliver that same video to the masses via the Internet. Whatever technology you use, its bandwidth would have to sustain that 27 megabytes per second, without fail.
  • D1 capability is needed if you want the extra resolution to positively identify objects or people that would otherwise be unrecognizable.  It's better to have a snapshot of a recognizable face than a barely recognizable blob, especially at a distance.

  • Network capability?  Remote monitoring and remote setup multiply the usability and convenience of a DVR many times over.  Imagine having to visit each and every DVR in person, especially in a multi-site installation.
  • Multiplex operation?  This is the most overlooked feature in DVRs.  You don't want recording to stop just because somebody is viewing video over the network or adding a user to the DVR.
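As a worked version of the storage arithmetic in the compression bullet above, here is a small C sketch using the same assumed figures: just under one megabyte per uncompressed frame and 30 frames per second.

    #include <stdio.h>

    int main(void)
    {
        double frame_mb  = 0.9;                         /* "just under one megabyte" per frame */
        double rate_mb_s = frame_mb * 30;               /* ~27 MB/s of raw video               */

        double disk_min = 80.0 * 1024 / rate_mb_s / 60; /* 80 GB hard drive */
        double dvd_min  = 4.5  * 1024 / rate_mb_s / 60; /* 4.5 GB DVD       */

        printf("raw rate  : %.0f MB/s (%.0f Mbit/s)\n", rate_mb_s, rate_mb_s * 8);
        printf("80 GB disk: about %.0f minutes of raw video\n", disk_min);
        printf("4.5 GB DVD: about %.1f minutes of raw video\n", dvd_min);
        return 0;
    }

The output lines up with the figures above: roughly 27 MB/s (216 Mbit/s), about 50 minutes on an 80 GB drive and under three minutes on a DVD.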


Tuesday, October 13, 2009

Building a DVR


How do we build a DVR from scratch?   Here is a mini guided tour of our R&D facility.

The process starts with a list of required specifications and features.  From these requirements, two teams collaborate on the design and production of the DVR: hardware and software.


Depending on the requirements, the hardware team designs either the motherboard alone or the motherboard together with the case.  A block diagram based on the full specification is the first order of business.  The block diagram is followed by a schematic diagram (see illustration at left).  The whole process takes around ten (10) working days or less to complete.


     Sample schematic diagram

After the schematics, the printed circuit board (PCB) is laid out based on the schematic diagram; this takes around two days to complete.    For the casing, the mechanical team produces a mock-up prior to the engineering sample.


PCB design



Mechanical

After each stage, quality control staff check and monitor the process and output against established company standards.  The next step is PCB production.

The finished PCBs are tested, and a few samples undergo surface-mount device (SMD) assembly.

 
After assembly, the finished board is tested for defects.  The few samples produced are used for the prototype and subsequent engineering samples for further testing.

Some of the finished engineering samples are then turned over to the software team for testing and for driver and application development.



After the driver and applications are completed, the finished product undergoes extensive testing prior to the release of the commercial version, including usability testing, aging tests, and feature and function testing.

Some of our sample products are shown below: