Journal for Research| Volume 02| Issue 04 | June 2016
ISSN: 2395-7549
Design and Implementation of Camera-Based Interactive Touch Screen
Dr. Siny Paul, Associate Professor, Department of Electrical and Electronics Engineering, Mar Athanasius College of Engineering, Kothamangalam
Vishnu P, UG Student, Department of Electrical and Electronics Engineering, Mar Athanasius College of Engineering, Kothamangalam
Susmitha James, UG Student, Department of Electrical and Electronics Engineering, Mar Athanasius College of Engineering, Kothamangalam
Sona Jo Madhavathu, UG Student, Department of Electrical and Electronics Engineering, Mar Athanasius College of Engineering, Kothamangalam
Abstract
Camera-based Interactive Touch Screen is a touch detection technique that uses a camera to provide a large display with very
high spatial and temporal resolutions. Conventional touch screen technologies and presentation methods face a range of restrictions. Camera-based touch detection can overcome these restrictions and turn projection screens into interactive touch displays, creating a through-window experience. It uses a coated sheet of glass as the projection surface to form a two-dimensional display. The camera continuously captures images of the projection surface, which are processed by the ATmega16 microcontroller. A UART module connected to the microcontroller provides asynchronous serial communication with external devices, synchronisation of the serial data stream and recovery of data characters. This technology has several
advantages over other touch detection technologies, such as its low cost, simple design and scalable structure. The applications
of this technology include advertising, presentations and outdoor displays.
Keywords: Camera, Coated Glass, Presentation, Projector, Touch Screen, UART Module
_______________________________________________________________________________________________________
I. INTRODUCTION
Since the early 1970s, touch screen technologies have become common in many everyday applications. Ranging from ATMs to PDAs to the first commercially available tablet PC, developed by IBM in 1993, touch screen technologies find a
wide range of uses in our lives. Despite the rapid improvements in this field, it is still difficult to construct an inexpensive touch
screen sensor that can register multiple simultaneous points of contact.
Past attempts to create touch screens have focused on the use of individual sensors. A touchscreen is an input device normally layered on top of the electronic visual display of an information processing system. A user can give input to or control the information processing system through simple or multi-touch gestures by touching the screen with a special stylus and/or one or more fingers. Some touchscreens work with ordinary or specially coated gloves, while others respond only to a special stylus or pen. The
user can use the touchscreen to react to what is displayed and to control how it is displayed; for example, zooming to increase the
text size.
Touchscreens are common in devices such as game consoles, personal computers, tablet computers, electronic voting
machines and smartphones. They can also be attached to computers or, as terminals, to networks. They also play a prominent
role in the design of digital appliances such as personal digital assistants (PDAs) and e-book readers.
The touchscreen enables the user to interact directly with what is displayed, rather than using a mouse, touchpad, or any other
intermediate device (other than a stylus, which is optional for most modern touchscreens). The development of multipoint
touchscreens facilitated the tracking of more than one finger on the screen; thus, operations that require more than one finger are
possible. These devices also allow multiple users to interact with the touchscreen simultaneously.
With the growing use of touchscreens, the marginal cost of touchscreen technology is routinely absorbed into the products that
incorporate it and is nearly eliminated. Touchscreens now have proven reliability. The ability to accurately point on the screen
itself is also advancing with the emerging graphics tablet/screen hybrids.
The objectives of the project are to realise a touch screen using a camera and a coated glass sheet, to develop an interactive
and user-friendly method of presentation, to increase the size of the touch screen beyond the capabilities of existing technologies and
to overcome their drawbacks.
II. CAMERA-BASED TOUCH DETECTION
Camera-based touch detection systems can be implemented using different methods. Here, we use the laser light plane technique
as its immunity to noise is much higher than that of the shadow detection technique.
The information flows in two ways: there are commands from the computer that change the different characteristics of the camera, and images from the camera that are sent to the computer. The communication between the computer and the AVR is implemented through the serial port, and the communication between the camera and the microcontroller is done using the I2C protocol, which can access the different registers of the camera. An 8-bit port is used to read the images.
Fig. 1: Block Diagram
The surface is illuminated by single or multiple lasers. The laser plane of light is about 1 mm thick and positioned close to the touch surface. The laser module has a built-in driver. The display is projected onto the coated glass screen by the projector placed behind the screen. The camera behind the screen continuously captures images and transmits them as RGB data to the microcontroller. The microcontroller communicates with the system through a UART module [1]-[4].
Camera-Microcontroller Interface
The camera-microcontroller interfacing forms the main part of the project. A C3088 color camera module with digital output is
used. It uses the CMOS image sensor OV6620 from OmniVision. It has a digital video port that supplies a continuous 8/16-bit-wide image data stream. All the camera functions, such as exposure, gamma, gain, white balance and windowing, can be changed through the I2C interface by writing to its registers.
The video output can be expressed in different formats and with different types of channels (RGB, YUV). The information is sent continuously and is synchronized by the HREF, VSYNC and PCLK signals, as shown in Fig. 2.
The VSYNC signal indicates a new frame, the HREF signal indicates whether the information is valid and provides the horizontal synchronization, and PCLK is the clock signal (the data is valid on the rising edge). The period of PCLK can be changed by writing to the registers of the camera. This allows the microcontroller to read images from the camera directly, without the use of additional hardware. The registers accessed over the I2C bus allow different properties of the camera to be changed. In this project, they are used to change the period of PCLK, to read the size of the image, to mirror the image, and to reset the camera.
As the initial frequency of PCLK is 17.73 MHz and the AVR is not fast enough to read each pixel at this frequency, there are
two possible solutions:
- Use additional hardware to read and store the image.
- Decrease the frequency of PCLK by writing to register 0x11.
We use the latter method. The frequency needed to read the image depends on how the image is read. If we read the image by horizontal lines, we need to use the lowest allowed frequency, which is 69.25 kHz. This lets us read one line that is simultaneously stored in the memory of the AVR. On the other hand, if we read a vertical line of the image in each frame, a higher frequency of 260 kHz can be used, but we need to read as many frames as there are vertical lines to get a complete image. In the case of horizontal line reading, the resulting image is too bright, and that is the reason why vertical reading is used, even though we need to read as many frames as there are vertical lines.
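As a concrete illustration of this register write, the following sketch lowers PCLK by writing the camera's clock-divider register 0x11 over the ATmega16 TWI (I2C) hardware. It is only a minimal example, not the project code: the OV66xx write address 0xC0, the divider value 0xFF and the 16 MHz CPU clock are assumptions that must be checked against the sensor datasheet and the actual board.

/* Minimal sketch (not the project's code): lowering PCLK by writing the
 * camera's clock-divider register 0x11 over the ATmega16 TWI (I2C) port.
 * The OV66xx write address 0xC0, the divider value 0xFF and the 16 MHz
 * CPU clock are assumptions; check them against the sensor datasheet. */
#include <avr/io.h>
#include <stdint.h>

#define CAM_WRITE_ADDR 0xC0   /* assumed OV66xx SCCB/I2C write address */
#define REG_CLKRC      0x11   /* PCLK prescaler register named in the text */

static void twi_wait(void)  { while (!(TWCR & (1 << TWINT))) ; }

static void twi_start(void)
{
    TWCR = (1 << TWINT) | (1 << TWSTA) | (1 << TWEN);   /* send START */
    twi_wait();
}

static void twi_write(uint8_t byte)
{
    TWDR = byte;
    TWCR = (1 << TWINT) | (1 << TWEN);                  /* clock the byte out */
    twi_wait();
}

static void twi_stop(void)
{
    TWCR = (1 << TWINT) | (1 << TWSTO) | (1 << TWEN);   /* send STOP */
}

static void cam_write_reg(uint8_t reg, uint8_t value)
{
    twi_start();
    twi_write(CAM_WRITE_ADDR);        /* slave address, write direction */
    twi_write(reg);
    twi_write(value);
    twi_stop();
}

int main(void)
{
    /* ~100 kHz SCL from a 16 MHz clock: SCL = F_CPU / (16 + 2*TWBR*4^TWPS) */
    TWSR = 0x00;                      /* prescaler = 1 */
    TWBR = 72;                        /* (16e6/100e3 - 16) / 2 = 72 */
    cam_write_reg(REG_CLKRC, 0xFF);   /* assumed slowest-PCLK setting */
    for (;;) ;
}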
Fig. 2: AVR-camera interfacing
The horizontal reading is used to read one horizontal line and perform a small amount of image processing on it. This frequency was selected experimentally, trying to use the highest possible frequency. When an image is to be sent to the computer, the headers and the palette are sent first, and then we proceed to read the image from the camera. We read each frame and column and send the data pixel by pixel through the serial port to the computer [5].
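The pixel-by-pixel transfer to the computer amounts to repeated blocking writes to the USART data register. The sketch below shows only that step; the 19,200 bps rate and 16 MHz clock are the values quoted later in the software description, and the header and palette bytes sent before the image data are omitted.

/* Minimal sketch of the pixel-by-pixel serial transfer.  The 19,200 bps
 * rate and 16 MHz clock are taken from the software description later in
 * the paper; the header and palette bytes are omitted here. */
#include <avr/io.h>
#include <stdint.h>

void uart_init_19200(void)            /* called once at start-up */
{
    UBRRH = 0;
    UBRRL = 51;                       /* 16 MHz / (16 * 19200) - 1 = 51 */
    UCSRB = (1 << TXEN);              /* enable the transmitter only */
    UCSRC = (1 << URSEL) | (1 << UCSZ1) | (1 << UCSZ0);   /* 8N1 frames */
}

void uart_send(uint8_t b)
{
    while (!(UCSRA & (1 << UDRE))) ;  /* wait until the data register is empty */
    UDR = b;
}

/* Stream one captured line (n samples) to the computer, pixel by pixel. */
void send_line(const uint8_t *line, uint8_t n)
{
    uint8_t i;
    for (i = 0; i < n; i++)
        uart_send(line[i]);
}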
I2C Protocol
Inter-Integrated Circuit, abbreviated as I2C, is a short-distance serial bus protocol developed by Philips Semiconductors about two decades ago to enhance communication between the core of a board and the various other ICs around the core.
The most popular serial bus communication protocols available today are SPI, UART, I2C, CAN, USB, IEEE 1394, and so on. Philips originally developed I2C for communication between devices inside a TV set. Examples of simple
I2C-compatible devices found in embedded systems include EEPROMs, thermal sensors, and real-time clocks. I2C is also used
as a control interface for signal processing devices that have separate, application-specific data interfaces. Philips, National
Semiconductor, Xicor, Siemens, and other manufacturers offer hundreds of I2C-compatible devices. I2C buses can typically reach speeds of up to 400 kbps.
I2C is appropriate for interfacing devices on a single board, and can be stretched across multiple boards inside a closed
system. An example is a host CPU on a main embedded board using I2C to communicate with user interface devices located on a
separate front panel board. I2C is a two-wire serial bus. No chip-select or arbitration logic is required, making the hardware cheap and simple to implement. The two important I2C signals are serial data (SDA) and serial clock (SCL). Together, these signals make it possible to transmit 8-bit data bytes, 7-bit device addresses and control bits over the two-wire serial bus.
An I2C slave can hold off the master in the middle of a transaction using what is called clock stretching (the slave keeps SCL pulled low until it is ready to continue). The I2C protocol can also support multiple masters. There may be one or more slaves on the bus. Both masters and slaves can receive and transmit data bytes.
Each I2C-compatible hardware slave device comes with a predefined device address, the lower bits of which may be
configurable at the board level. The master transmits the device address of the intended slave at the beginning of every
transaction. Each slave is responsible for monitoring the bus and responding only to its own address. This addressing scheme
limits the number of identical slave devices that can exist on an I2C bus without contention, with the limit set by the number of
user-configurable address bits.
The I2C signaling protocol provides device addressing, a read/write flag, and a simple acknowledgement mechanism. Other
elements of the I2C protocol are general call (broadcast) and 10-bit extended addressing. Standard I2C devices operate at up to 100 kbps, while fast-mode devices operate at up to 400 kbps. Most often, the I2C master is the CPU or microcontroller in the
system. Some microcontrollers even feature hardware to implement the I2C protocol. You can also build an all-software
implementation using a pair of general-purpose I/O pins. Since the I2C master controls transaction timing, the bus protocol
doesn't impose any real-time constraints on the CPU beyond those of the application. For fixed input levels, the high and low logic thresholds are defined at 3.0 V and 1.5 V; for Vdd-dependent levels, they are defined at 0.7*Vdd and 0.3*Vdd respectively. The pull-up resistor values required for I2C are typically about 1 kΩ for a 3.0 V Vdd and 1.6 kΩ for a 5 V Vdd. Typical operating temperatures are between -40 °C and +85 °C [6].
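The all-software option mentioned above can be sketched as a bit-banged I2C master on two general-purpose pins. This is a hedged illustration rather than the project's code: the pin assignment (PC0 = SCL, PC1 = SDA), the 5 µs half-bit delay and the reliance on external pull-up resistors for the open-drain behaviour are assumptions. Note how the clock-stretching rule described above appears as a wait for SCL to actually go high.

/* Hedged sketch of an all-software ("bit-banged") I2C master on two
 * general-purpose pins.  Open-drain behaviour is emulated by switching
 * each pin between output-low and input (released, pulled high by the
 * external pull-up resistors).  The pin assignment PC0 = SCL, PC1 = SDA
 * and the 5 us half-bit delay are illustrative assumptions. */
#define F_CPU 16000000UL
#include <avr/io.h>
#include <stdint.h>
#include <util/delay.h>

#define SCL_PIN PC0
#define SDA_PIN PC1

static void pin_release(uint8_t p) { DDRC &= ~(1 << p); PORTC &= ~(1 << p); }
static void pin_low(uint8_t p)     { PORTC &= ~(1 << p); DDRC |= (1 << p); }

static void scl_high(void)
{
    pin_release(SCL_PIN);
    while (!(PINC & (1 << SCL_PIN))) ;   /* clock stretching: wait while a slave holds SCL low */
}

void i2c_start(void)
{
    pin_release(SDA_PIN); scl_high(); _delay_us(5);
    pin_low(SDA_PIN); _delay_us(5);      /* SDA falls while SCL is high */
    pin_low(SCL_PIN); _delay_us(5);
}

void i2c_stop(void)
{
    pin_low(SDA_PIN); _delay_us(5);
    scl_high(); _delay_us(5);
    pin_release(SDA_PIN); _delay_us(5);  /* SDA rises while SCL is high */
}

/* Shift one byte out, MSB first; returns 0 if the slave acknowledged. */
uint8_t i2c_write_byte(uint8_t b)
{
    uint8_t i, ack;
    for (i = 0; i < 8; i++) {
        if (b & 0x80) pin_release(SDA_PIN); else pin_low(SDA_PIN);
        b <<= 1;
        _delay_us(5); scl_high(); _delay_us(5); pin_low(SCL_PIN);
    }
    pin_release(SDA_PIN);                /* let the slave drive the ACK bit */
    _delay_us(5); scl_high();
    ack = (PINC & (1 << SDA_PIN)) ? 1 : 0;
    _delay_us(5); pin_low(SCL_PIN);
    return ack;
}

A register write to a slave then consists of i2c_start(), the device address with the R/W bit clear, the register index, the data byte and i2c_stop().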
Operation
The laser light plane (LLP) technique used here is one of the newer illumination and tracking methods for detecting surface touches. As opposed to other optical sensing techniques that employ several arrays of LEDs to generate radiation, the LLP method uses commonly available laser diodes paired with line-generating lenses. By placing several lasers in the corners of the multitouch device, a light plane is generated that covers the whole active surface area. Depending on the size of the device, 2 to 8 lasers are commonly employed. Generally, two lasers placed in front of the screen would generate enough radiation to extend the light plane to the required dimensions. Since the light plane can be blocked by a finger or an object, illumination redundancy is needed so that even hidden markers receive sufficient lighting to scatter the radiation towards the sensing element.
The laser plane is 1 mm in thickness and is positioned in front of the surface at a distance varying from 0.5 to 2 mm. When an object comes in contact with the laser plane, radiation is scattered towards the camera, which acts as the radiation detector. The laser wavelength commonly used in this method is 635 nm, owing to the wide availability of optical filters and greater immunity to the environment.
The display is projected onto a coated glass screen by the projector placed behind the screen. The camera behind the screen continuously captures images. When a finger touches the light plane, it is lit up by the arrangement of plane red laser light modules in front of the screen. Thus, the point of touch is seen by the camera as an intense red spot. The camera transmits the images as RGB data to the microcontroller, which calculates the x and y coordinates of the point with the highest intensity of red. The data is then sent to the system through a UART module to produce the required response. Thus, touch is detected on the glass surface [7]-[11].
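Conceptually, the coordinate calculation described above is a search for the brightest red sample. The sketch below illustrates that idea on a whole frame buffer; it is not the authors' code, and on the ATmega16 the same comparison is in practice applied line by line (as Section IV explains), since a full 176x144 frame does not fit in SRAM. The threshold value and buffer layout are assumptions.

/* Illustrative sketch (not the authors' code) of the coordinate
 * calculation: scan the red-channel samples and keep the position of the
 * brightest pixel.  The frame-buffer layout and the threshold value are
 * assumptions. */
#include <stdint.h>

#define WIDTH          176
#define HEIGHT         144
#define RED_THRESHOLD  180     /* assumed calibration value */

/* Returns 1 and fills *x, *y when a sufficiently red pixel is found. */
int find_touch(const uint8_t red[HEIGHT][WIDTH], uint8_t *x, uint8_t *y)
{
    uint8_t r, c, best = 0;
    int found = 0;
    for (r = 0; r < HEIGHT; r++) {
        for (c = 0; c < WIDTH; c++) {
            if (red[r][c] >= RED_THRESHOLD && red[r][c] > best) {
                best = red[r][c];
                *x = c;           /* column of the most intense red spot */
                *y = r;           /* row of the most intense red spot */
                found = 1;
            }
        }
    }
    return found;
}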
Fig. 3: Laser Light Plane Technique
III. HARDWARE DESCRIPTION
The various hardware components used in the project are:
Coated Glass
The sheet of glass used as the touch screen is coated with a layer of acrylic so as to form an image on the screen. Acrylic is a useful, clear plastic that resembles glass. Common brands of acrylic include Polycast, Lucite, Plexiglas and Optix. It is available in a variety of thicknesses, colours and finishes. The most popular uses for acrylic signage include facility signs and interior dimensional lettering. It can be printed, routed, engraved, bevelled, polished, painted, cut, glued, bent, drilled and lasered. Acrylic does not sandblast well [12].
Camera C3088
The C3088 is a 1/4” colour camera module with digital output. It uses OmniVision's CMOS image sensor OV6620. Combining CMOS technology with an easy-to-use digital interface makes the C3088 a low-cost solution for higher-quality video imaging applications. The digital video port supplies a continuous 8/16-bit-wide image data stream. All camera functions, such as exposure, gamma, gain, white balance, colour matrix and windowing, are programmable through the I2C interface.
ATmega16 Microcontroller
The ATmega16 is a low-power CMOS 8-bit microcontroller based on the AVR enhanced RISC architecture. By executing powerful instructions in a single clock cycle, the ATmega16 achieves throughputs approaching 1 MIPS per MHz, allowing the system designer to optimize power consumption versus processing speed.
Plane Laser Light Module
These are encapsulated laser diodes of Class IIIa, 5 mW, with a 650 nm red wavelength. They can be driven from 2.8 V to 5.2 V. This particular laser diode has a lens attached to turn the dot into a line. The line beam spread is 120°. The line fades out and does not have a sharp cut-off.
It is not a laser pointer, but a diode with an integrated driver, and it requires a 3 V to 5 V DC power supply. It has a 5 mW output and a safety label marked on it.
UART Module
A UART (Universal Asynchronous Receiver/Transmitter) is the microchip with programming that controls a computer's
interface to its attached serial devices. Specifically, it provides the computer with the RS-232C Data Terminal Equipment (DTE)
interface so that it can "talk" to and exchange data with modems and other serial devices.
IC Voltage Regulator
Voltage regulator ICs are available with fixed (typically 5, 12 and 15V) or variable output voltages. They are also rated by the
maximum current they can pass. Negative voltage regulators are available, mainly for use in dual supplies. Most regulators
include some automatic protection from excessive current ('overload protection') and overheating ('thermal protection'). Many of the fixed voltage regulator ICs have three leads and look like power transistors, such as the 7805 +5 V, 1 A regulator. They include a hole for attaching a heat sink.
Projector
A projector or image projector is an optical device that projects an image (or moving images) onto a surface, commonly a
projection screen. Most projectors create an image by shining a light through a small transparent lens, but some newer types of
projectors can project the image directly, by using lasers.
A projector is a highly versatile presentation tool. Here we use a projector (BenQ/Epson) to project the computer screen onto the plane glass with the help of a thin acrylic sheet.
PC/Laptop
We do not need to build a computer specifically for this project. In fact, any recent computer with a decent processor and
graphics card should work well. However, if you plan on putting the surface in an enclosure when you are done, then ensure the
computer has adequate cooling and minimal thermal dissipation.
IV. EXPERIMENTAL SETUP AND RESULT
Various parts of the system are constructed and connected together to work in unison. The mechanical parts do not interact with
the electrical parts, which simplifies the design. The main requirement of the mechanical parts is to hold the system together
while allowing adjustments to be made regarding the positioning of the components relative to one another. The main parts that
need to be held in place are the screen, the laser modules, the projector and the camera. The screen size settled on was 30” diagonal, with an aspect ratio of 4:3.
The final product needs to be sturdy, adjustable, upgradeable and portable. The system should be able to work both standing
up, which is easiest for a single user, and also lying down to accommodate a large group of people.
Hardware Implementation
The three main components of our hardware design are as follows:
- Laser module
- Camera and its associated circuitry
- Outer casing for the entire device
Fig. 4: Plane Laser Light Module
Laser Module:
Our original plan, at the time of the project proposal, was to use an infrared laser to detect button presses using the CMOS
camera, but we realized that user safety would be a major issue in that case. The user would never know if he/she were staring directly at the laser and, therefore, there would be no way to prevent eye damage. In addition, we also realized that the CMOS
camera we’re using (OV6630) is not very effective at detecting infrared light. Hence, we decided to use a Class IIIa 635 nm red
laser instead (Fig. 4).
The laser module we bought came with a built-in driver; therefore, we didn’t have to worry about biasing the laser properly to
make it operational. All we had to do was to connect the laser to a 3V power source, which we obtained using a simple 3V
voltage regulator.
The laser module also came with a line-generating diffractive optical element attached to it. However, since we didn’t know
the fan-angle for this DOE, we had to experiment with various distances in order to obtain a line length of at least 8.5”, which
was required to cover the entire screen. In the end, we had to place the laser at a distance of approximately 12.5” to obtain good
results.
Camera and its Associated Circuitry:
We decided to use the C3088 1/4” color sensor module with digital output, which uses OmniVision’s CMOS image sensor
OV6630. The two primary reasons why we chose this specific camera module were its low cost and the fact that it is capable of
outputting image color data in progressive scan mode. Progressive scanning was an important consideration for us since we don’t
have enough computational power available on the 16 MHz ATmega16 microcontroller to process entire frames at once; however,
we can certainly process images line-by-line as they come in. After rigorous testing, we realized that we could work with only
the red channel data from the camera. Hence, we connected the 8-bit red channel output from the camera (UV[7:0]) to
PORTA[7:0] on the Mega16. We decided to use a resolution of 176x144. At this resolution, we could capture at most 6 frames
of color images per second.
The camera output format was set to capture 16-bit UV/Y data, where the UV channel carried B R B R data and the Y channel carried G G G G data. The Y data was completely ignored.
Outer Casing for the Entire Device:
The hardware assembly for our device is designed to hold the camera at a fixed position.
Software Implementation
The software was split into four main components:
1) Implementing the I2C protocol to read and write the camera's registers
2) Reading values from the camera to obtain 6 frames every second
3) Processing the images
4) Sending serial data to update the array of scan codes in the Mega16
At first we initialize PORTA on the Mega16 to take UV input from the camera and PORTC to communicate with the camera
over the I2C interface. The baud rate is set to 19,200 bps for serial communication. We then run the calibrate function on the
camera. Then we call a function called "init_cam" which performs a soft reset on the camera before writing the required values
to corresponding camera registers. These registers change the frame size to 176x144, turn on auto white balance, set the frame
rate to 6 fps, and set the output format to 16-bit on the Y/UV mode with Y=G G G G and UV = B R B R. The code then enters
an infinite loop which checks the status of the PS/2 transmitting queue and tries to process the next captured frame if the queue is empty. If not, the queue is updated and the PS/2 transmission is allowed to continue. The microcontroller captures rows
of data from the camera and processes each of them. We first wait for a negative edge on the VSYNC, which indicates the arrival
of new frame data on the UV and Y lines. Then, we wait for one HREF to go by since the first row of data is invalid. At this
point, we can clock in 176 pixels of data for a given vertical line.
In the mode where the UV line receives BR data, the output is given by: B11 R22 B13 R24 and so on. Since we only needed red data, we keep an array of 88 values in which we store the data on the UV line every 2 PCLKs. The OV6630 also repeats the same set of pixels for consecutive rows, and thus 2 vertical lines processed would have data about the same pixels. Since we don't have enough memory to store entire frames of data to process, we do the processing after each vertical line. After each vertical line of valid data, HREF stays negative for about 0.8 ms and the camera data becomes invalid; this gives us ample time to process one line worth of data.
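A minimal sketch of this capture step is shown below: wait for the falling edge of VSYNC, let the first (invalid) HREF period pass, then latch PORTA on each rising edge of PCLK, keeping every second UV byte as the red sample. The VSYNC, HREF and PCLK pin assignments on PORTD are assumptions; the paper does not state how these lines are wired to the microcontroller.

/* Minimal sketch of the capture step described above.  The VSYNC, HREF
 * and PCLK pin assignments on PORTD are assumptions. */
#include <avr/io.h>
#include <stdint.h>

#define VSYNC  PD2             /* assumed wiring */
#define HREF   PD3
#define PCLK   PD4

#define LINE_PIXELS 176

static uint8_t red_line[LINE_PIXELS / 2];   /* 88 red samples, as in the text */

static uint8_t pin(uint8_t p) { return (uint8_t)(PIND & (1 << p)); }

void capture_line(void)
{
    uint8_t i;

    while (!pin(VSYNC)) ;      /* wait for VSYNC to go high ...          */
    while (pin(VSYNC)) ;       /* ... and then for its falling edge      */

    while (!pin(HREF)) ;       /* the first row of data is invalid,      */
    while (pin(HREF)) ;        /* so let one whole HREF period go by     */

    while (!pin(HREF)) ;       /* start of the line we keep              */
    for (i = 0; i < LINE_PIXELS; i++) {
        while (!pin(PCLK)) ;   /* data is valid on the rising edge       */
        if (i & 1)
            red_line[i / 2] = PINA;   /* UV = B R B R: every second byte is red */
        while (pin(PCLK)) ;    /* wait for PCLK to fall again            */
    }
}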
After each vertical line was captured, we looped through each pixel to check if it exceeded the red threshold found during
calibration. For every pixel that met this threshold, we then checked if the pixel was part of a contiguous line of red pixels, which
would indicate a touch. If such a pixel was found, we then mapped this pixel to a scan code by binary searching through an array
of x, y values. If this scan code was found to be valid, we calculated the time for which the red pixel remained stationary, and then decided the action to be performed, i.e., a mouse click or pointer movement.
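The decision logic just described can be sketched as follows: scan the captured line for a contiguous run of samples above the calibrated red threshold, then map the run's position to a scan code with a binary (lower-bound) search over a sorted table. The minimum run length and the example table contents are illustrative assumptions, and the dwell-time measurement that separates a click from pointer movement is omitted.

/* Hedged sketch of the per-line decision step.  MIN_RUN and the example
 * scancode_map contents are assumptions, not the authors' values. */
#include <stdint.h>

#define MIN_RUN 3                          /* assumed minimum touch width */

struct map_entry { uint8_t x; uint8_t code; };

/* Example table for one line, sorted by x: three regions, codes 1 to 3. */
static const struct map_entry scancode_map[] = {
    {  58, 1 },   /* left third of the line   */
    { 117, 2 },   /* middle third             */
    { 175, 3 },   /* right third              */
};
static const uint8_t map_len = sizeof scancode_map / sizeof scancode_map[0];

static int code_for_x(uint8_t x)
{
    uint8_t lo = 0, hi = map_len;
    while (lo < hi) {                      /* binary search, as in the text */
        uint8_t mid = (uint8_t)((lo + hi) / 2);
        if (scancode_map[mid].x < x) lo = mid + 1; else hi = mid;
    }
    return (lo < map_len) ? scancode_map[lo].code : -1;
}

/* Returns the scan code for the first red run in the line, or -1. */
int detect_touch(const uint8_t *line, uint8_t n, uint8_t threshold)
{
    uint8_t i, run = 0;
    for (i = 0; i < n; i++) {
        run = (line[i] >= threshold) ? (uint8_t)(run + 1) : 0;
        if (run >= MIN_RUN)
            return code_for_x((uint8_t)(i - run + 1));
    }
    return -1;
}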
Experimental Setup
The whole system is arranged as shown in Figs. 5 and 6.
Fig. 5: Hardware Setup
Fig. 6: Display Screen
Result
The project was tested successfully in the presence of teachers and students. The hardware setup was done and the project was
tested for controlling the mouse pointer and performing the 'left-click' action.
The future developments that can be incorporated are:
- Developing a stylus pen for 'write' and 'copy-paste' actions.
- Upgrading the single-touch screen to a multi-touch one.
V. CONCLUSION
Touch technology has come a long way in the last decade. Just six years ago, most phones used traditional keypads; today,
almost all smartphones have a touchscreen, and the technology has spread to tablets, handheld consoles and laptops as well.
Although camera-based touch detection systems appear large and complicated, they are extremely useful in cases where the
size of the display is very large.
The system has several advantages: it requires no compliant surface or LED frame, can use any transparent material as the projection surface, is somewhat cheaper than other technologies and has a simple setup. However, it also has disadvantages, including the inability to track traditional objects and fiducials, insensitivity to pressure, and possible occlusion (in the multi-touch case) when only one laser is used, i.e., light hitting one finger blocks another finger from receiving light.
The applications of the project include user-friendly presentation boards, advertising screens, route maps, etc.
REFERENCES
[1] Jain, Anjul, Diksha Bhargava, and Anjani Rajput. "Touch-screen Technology." International Journal of Advanced Research in Computer Science and Electronics Engineering (IJARCSEE) 2.1 (2013): 074.
[2] Rosin, Hanna. "The touch-screen generation." The Atlantic 20 (2013).
[3] J. Hu, G. Li, X. Xie, Z. Lv and Z. Wang, "Bare-fingers Touch Detection by the Button's Distortion in a Projector–Camera System," IEEE Transactions on Circuits and Systems for Video Technology, vol. 24, no. 4, pp. 566-575, April 2014.
[4] Katz, Itai, Kevin Gabayan, and Hamid Aghajan. "A multi-touch surface using multiple cameras." Advanced Concepts for Intelligent Vision Systems. Springer Berlin Heidelberg, 2007.
[5] Kaur, Ramandeep. "To develop an interface among C3088 camera and computer using AVR microcontroller," unpublished.
[6] Hu, Zhen, and Junaid Muzaffar. "Inter-Integrated-Circuit (I2C)." (2005).
[7] Gwang Jun Lee, Sang Kook Lee, Hong Kun Lyu, and Jae Eun Jang, "External Light Noise-Robust Multi-Touch Screen Using Frame Data Differential Method," IEEE journal, DOI 10.1109/JDT.2015.2438317, pp. 750-763.
[8] Marc Bender and Mark Lawford, "A low-power, low-cost automotive touchscreen with real controls," IEEE CCECE 2011.
[9] Peter Brandl, Michael Haller, Michael Hurnaus, Verena Lugmayr, Juergen Oberngruber, Claudia Oster, Christian Schafleitner, and Mark Billinghurst, "An Adaptable Rear-Projection Screen Using Digital Pens and Hand Gestures," IEEE 2007, DOI 10.1109/ICAT.2007.12.
[10] Katz, Itai, Kevin Gabayan, and Hamid Aghajan. "A multi-touch surface using multiple cameras." Advanced Concepts for Intelligent Vision Systems. Springer Berlin Heidelberg, 2007.
[11] Chang, Rong, Feng Wang, and Pengfei You. "A Survey on the Development of Multi-touch Technology." Wearable Computing Systems (APWCS), 2010 Asia-Pacific Conference on. IEEE, 2010.
[12] Iezzi, Robert A., Scott Gaboury, and Kurt Wood. "Acrylic-fluoropolymer mixtures and their use in coatings." Progress in Organic Coatings 40.1 (2000): 55-60.
 
Beyond the EU: DORA and NIS 2 Directive's Global Impact
Beyond the EU: DORA and NIS 2 Directive's Global ImpactBeyond the EU: DORA and NIS 2 Directive's Global Impact
Beyond the EU: DORA and NIS 2 Directive's Global Impact
 
Contemporary philippine arts from the regions_PPT_Module_12 [Autosaved] (1).pptx
Contemporary philippine arts from the regions_PPT_Module_12 [Autosaved] (1).pptxContemporary philippine arts from the regions_PPT_Module_12 [Autosaved] (1).pptx
Contemporary philippine arts from the regions_PPT_Module_12 [Autosaved] (1).pptx
 
A Critique of the Proposed National Education Policy Reform
A Critique of the Proposed National Education Policy ReformA Critique of the Proposed National Education Policy Reform
A Critique of the Proposed National Education Policy Reform
 
BASLIQ CURRENT LOOKBOOK LOOKBOOK(1) (1).pdf
BASLIQ CURRENT LOOKBOOK  LOOKBOOK(1) (1).pdfBASLIQ CURRENT LOOKBOOK  LOOKBOOK(1) (1).pdf
BASLIQ CURRENT LOOKBOOK LOOKBOOK(1) (1).pdf
 
Student login on Anyboli platform.helpin
Student login on Anyboli platform.helpinStudent login on Anyboli platform.helpin
Student login on Anyboli platform.helpin
 
INDIA QUIZ 2024 RLAC DELHI UNIVERSITY.pptx
INDIA QUIZ 2024 RLAC DELHI UNIVERSITY.pptxINDIA QUIZ 2024 RLAC DELHI UNIVERSITY.pptx
INDIA QUIZ 2024 RLAC DELHI UNIVERSITY.pptx
 
Sanyam Choudhary Chemistry practical.pdf
Sanyam Choudhary Chemistry practical.pdfSanyam Choudhary Chemistry practical.pdf
Sanyam Choudhary Chemistry practical.pdf
 
Measures of Central Tendency: Mean, Median and Mode
Measures of Central Tendency: Mean, Median and ModeMeasures of Central Tendency: Mean, Median and Mode
Measures of Central Tendency: Mean, Median and Mode
 
Activity 01 - Artificial Culture (1).pdf
Activity 01 - Artificial Culture (1).pdfActivity 01 - Artificial Culture (1).pdf
Activity 01 - Artificial Culture (1).pdf
 
Paris 2024 Olympic Geographies - an activity
Paris 2024 Olympic Geographies - an activityParis 2024 Olympic Geographies - an activity
Paris 2024 Olympic Geographies - an activity
 
Z Score,T Score, Percential Rank and Box Plot Graph
Z Score,T Score, Percential Rank and Box Plot GraphZ Score,T Score, Percential Rank and Box Plot Graph
Z Score,T Score, Percential Rank and Box Plot Graph
 
mini mental status format.docx
mini    mental       status     format.docxmini    mental       status     format.docx
mini mental status format.docx
 

Design and Implementation of Camera-Based Interactive Touch Screen

The touchscreen enables the user to interact directly with what is displayed, rather than using a mouse, touchpad, or any other intermediate device (other than a stylus, which is optional for most modern touchscreens). The development of multipoint touchscreens facilitated the tracking of more than one finger on the screen; thus, operations that require more than one finger are possible. These devices also allow multiple users to interact with the touchscreen simultaneously. With the growing use of touchscreens, the marginal cost of touchscreen technology is routinely absorbed into the products that incorporate it and is nearly eliminated. Touchscreens now have proven reliability. The ability to point accurately on the screen itself is also advancing with the emerging graphics tablet/screen hybrids.
The objectives of the project are to realise a touch screen using a camera and a coated glass sheet, to develop an interactive and user-friendly method of presentation, to increase the size of the touch screen beyond the capabilities of existing technologies, and to overcome their drawbacks.
II. CAMERA-BASED TOUCH DETECTION
Camera-based touch detection systems can be implemented using different methods. Here, we use the laser light plane technique, as its immunity to noise is much higher than that of the shadow detection technique.
The information flows in two ways: commands from the computer change the different characteristics of the camera, and images from the camera are sent to the computer. The communication between the computer and the AVR is implemented through the serial port, and the communication between the camera and the microcontroller uses the I2C protocol, which can access the different registers of the camera. An 8-bit port is used to read the images.
Fig. 1: Block Diagram
The surface is illuminated by single or multiple lasers. The laser plane of light is about 1 mm thick and positioned close to the touch surface. The laser module has a built-in driver. The display is projected onto the coated glass screen by the projector placed behind the screen. The camera behind the screen continuously captures the images. The camera transmits the images as RGB data to the microcontroller. The microcontroller communicates with the system through a UART module [1]-[4].
Camera-Microcontroller Interface
The camera-microcontroller interfacing forms the main part of the project. A C3088 colour camera module with digital output is used. It uses a CMOS image sensor OV6620 from OmniVision. It has a digital video port that supplies a continuous 8/16-bit-wide image data stream. All the camera functions, such as exposure, gamma, gain, white balance and windowing, can be changed through the I2C interface by writing to certain registers. The video output can be expressed in different formats and with different types of channels (RGB, YUV). The information is sent continuously and is synchronised by the HREF, VSYNC and PCLK signals, as shown in Fig. 2. The VSYNC signal indicates a new frame, the HREF signal indicates whether the information is valid and provides the horizontal synchronisation, and PCLK is the clock signal (the data is valid on the rising edge). The period of PCLK can be changed by writing to the registers of the camera. This allows images to be read from the camera directly with the microcontroller, without the use of additional hardware.
The registers accessed through the I2C bus allow different properties of the camera to be changed. In this project, they are used to change the period of PCLK, to read the size of the image, to mirror the image, and to reset the camera. As the initial frequency of PCLK is 17.73 MHz and the AVR is not fast enough to read each pixel at this frequency, there are two possible solutions:
- Use additional hardware to read and store the image.
- Decrease the frequency of PCLK by writing to register 0x11.
We use the latter method. The frequency chosen to read the image depends on how the image is read. If we read the image by horizontal lines, we need to use the lowest allowed frequency, which is 69.25 kHz. This lets us read one line that is simultaneously stored in the memory of the AVR. On the other hand, if we read one vertical line of the image in each frame, a higher frequency of 260 kHz can be used, but we need to read as many frames as there are vertical lines to get a complete image. In the case of horizontal line reading, the resulting image is too bright, and that is the reason why vertical reading is used, even though we need to read as many frames as there are vertical lines.
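To make the timing concrete, the following is a minimal sketch (in C for the AVR, not the authors' exact code) of clocking one line of pixel data out of the camera by polling the synchronisation signals. The pin assignments (VSYNC on PD2, HREF on PD3, PCLK on PD4, pixel data on PORTA) are assumptions for illustration only.

/* Minimal sketch: read one line of pixels from the camera's video port by
   polling VSYNC, HREF and PCLK. PORTA and PORTD are assumed to be configured
   as inputs elsewhere; pin choices are illustrative, not from the paper. */
#include <avr/io.h>
#include <stdint.h>

#define VSYNC_HIGH() (PIND & (1 << PD2))
#define HREF_HIGH()  (PIND & (1 << PD3))
#define PCLK_HIGH()  (PIND & (1 << PD4))

#define LINE_PIXELS 176
static uint8_t line_buf[LINE_PIXELS];

void read_one_line(void)
{
    uint8_t i;
    while (!VSYNC_HIGH()) ;      /* wait for VSYNC to go high...              */
    while (VSYNC_HIGH()) ;       /* ...and for its negative edge: a new frame */
    while (!HREF_HIGH()) ;       /* wait until the line data becomes valid    */
    for (i = 0; i < LINE_PIXELS; i++) {
        while (!PCLK_HIGH()) ;   /* data is valid on the rising edge of PCLK  */
        line_buf[i] = PINA;      /* grab one 8-bit pixel from the video port  */
        while (PCLK_HIGH()) ;    /* wait for PCLK to fall before the next one */
    }
}

With PCLK slowed down as described above, a loop like this can keep up with one line at a time while the AVR stores it in memory.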
Fig. 2: AVR-Camera Interfacing
The horizontal reading is used to read one horizontal line and perform a small amount of image processing on it. The frequency was selected experimentally, trying to use the highest possible frequency. When an image is to be sent to the computer, the headers and the palette are sent first, and then the image is read from the camera. We read each frame and column and send the data pixel by pixel through the serial port to the computer [5].
I2C Protocol
Inter-Integrated Circuit, abbreviated as I2C, is a short-distance serial bus protocol developed by Philips Semiconductor about two decades ago to enhance communication between the core on the board and the various other ICs around the core. The most popular serial bus communication protocols available in the market today are SPI, UART, I2C, CAN, USB, IEEE 1394, and so on. Philips originally developed I2C for communication between devices inside a TV set. Examples of simple I2C-compatible devices found in embedded systems include EEPROMs, thermal sensors and real-time clocks. I2C is also used as a control interface for signal processing devices that have separate, application-specific data interfaces. Philips, National Semiconductor, Xicor, Siemens and other manufacturers offer hundreds of I2C-compatible devices. I2C buses can typically reach speeds of up to 400 kbps.
I2C is appropriate for interfacing devices on a single board and can be stretched across multiple boards inside a closed system. An example is a host CPU on a main embedded board using I2C to communicate with user interface devices located on a separate front-panel board. I2C is a two-wire serial bus. No chip-select or arbitration logic is required, making the hardware cheap and simple to implement. The two important I2C signals are serial data and serial clock. Together, these signals make it possible to support serial transmission of 8-bit data, i.e. 7-bit device addresses plus control bits, over the two-wire serial bus. An I2C slave can hold off the master in the middle of a transaction using what is called clock stretching (the slave keeps SCL pulled low until it is ready to continue).
The I2C protocol also supports multiple masters. There may be one or more slaves on the bus. Both masters and slaves can receive and transmit data bytes. Each I2C-compatible hardware slave device comes with a predefined device address, the lower bits of which may be configurable at the board level. The master transmits the device address of the intended slave at the beginning of every transaction. Each slave is responsible for monitoring the bus and responding only to its own address. This addressing scheme limits the number of identical slave devices that can exist on an I2C bus without contention, with the limit set by the number of user-configurable address bits. The I2C signalling protocol provides device addressing, a read/write flag and a simple acknowledgement mechanism. Other elements of the I2C protocol are the general call (broadcast) and 10-bit extended addressing. Standard I2C devices operate at up to 100 kbps, while fast-mode devices operate at up to 400 kbps. Most often, the I2C master is the CPU or microcontroller in the system. Some microcontrollers even feature hardware to implement the I2C protocol.
You can also build an all-software implementation using a pair of general-purpose I/O pins. Since the I2C master controls the transaction timing, the bus protocol does not impose any real-time constraints on the CPU beyond those of the application. For fixed input levels, the I2C high and low logic thresholds are defined as 3.0 V and 1.5 V; for Vdd-related levels, they are defined as 0.7*Vdd and 0.3*Vdd respectively. The pull-up resistor values required for I2C are typically 1 kΩ for a Vdd of 3.0 V and 1.6 kΩ for a Vdd of 5 V. Typical operating temperatures are between -40 °C and +85 °C [6].
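As an illustration of such an all-software approach, the following is a minimal bit-banged I2C master register write in C. The SDA/SCL pin choice (two PORTC pins, matching the port used for the camera link later in the paper), the delay value and the open-drain emulation are assumptions for illustration, not the authors' implementation; external pull-up resistors are assumed on both lines.

/* Minimal sketch of a bit-banged I2C master write on the ATmega16.
   Open-drain behaviour is emulated by switching each pin between
   output-low and input (released, pulled high by an external resistor). */
#define F_CPU 16000000UL          /* assumed 16 MHz clock, as used in the project */
#include <avr/io.h>
#include <util/delay.h>
#include <stdint.h>

#define SDA PC1                   /* assumed pin assignments */
#define SCL PC0

static void pin_release(uint8_t pin) { DDRC &= ~(1 << pin); PORTC &= ~(1 << pin); }
static void pin_low(uint8_t pin)     { DDRC |=  (1 << pin); PORTC &= ~(1 << pin); }

static void i2c_start(void)
{
    pin_release(SDA); pin_release(SCL); _delay_us(5);
    pin_low(SDA); _delay_us(5);           /* SDA falls while SCL is high: START */
    pin_low(SCL); _delay_us(5);
}

static void i2c_stop(void)
{
    pin_low(SDA); _delay_us(5);
    pin_release(SCL); _delay_us(5);
    pin_release(SDA); _delay_us(5);        /* SDA rises while SCL is high: STOP  */
}

static void i2c_write_byte(uint8_t b)
{
    uint8_t i;
    for (i = 0; i < 8; i++) {              /* shift the byte out MSB first       */
        if (b & 0x80) pin_release(SDA); else pin_low(SDA);
        _delay_us(5);
        pin_release(SCL); _delay_us(5);    /* clock the bit out                  */
        pin_low(SCL); _delay_us(5);
        b <<= 1;
    }
    pin_release(SDA);                      /* 9th clock: ACK from the slave      */
    pin_release(SCL); _delay_us(5);        /* (ignored in this simple sketch)    */
    pin_low(SCL); _delay_us(5);
}

/* Write one value to one register of a slave with a 7-bit device address. */
void i2c_reg_write(uint8_t dev_addr, uint8_t reg, uint8_t val)
{
    i2c_start();
    i2c_write_byte((uint8_t)(dev_addr << 1));  /* address byte, write bit = 0 */
    i2c_write_byte(reg);
    i2c_write_byte(val);
    i2c_stop();
}

The ATmega16's hardware TWI peripheral could equally be used; the bit-banged version is shown only because the text mentions the all-software option.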
Operation
The laser light plane (LLP) technique used here is one of the newest illumination and tracking methods for surface touches. As opposed to other optical sensing techniques that employ several arrays of LEDs to generate radiation, the LLP method uses commonly available laser diodes paired with line-generating lenses. By placing several lasers in the corners of the multi-touch device, a light plane is generated that covers the whole active surface area. Depending on the size of the device, 2 to 8 lasers are commonly employed. Generally, two lasers placed in front of the screen generate enough radiation to extend the light plane to the required dimensions. Since the light plane can be blocked by placing a finger or an object in it, there is a need for illumination redundancy, so that even hidden markers receive sufficient lighting to scatter the radiation towards the sensing element. The laser plane is 1 mm thick and is positioned in front of the surface at a distance varying from 0.5 to 2 mm. When an object comes into contact with the laser plane, radiation is scattered towards the camera, which acts as the radiation detector. The common laser wavelength used in this method is 635 nm, for the increased availability of optical filters and greater environmental immunity.
The display is projected onto a coated glass screen by the projector placed behind the screen. The camera behind the screen continuously captures the images. When a finger touches the light plane, it is lit up by an arrangement of plane red laser light modules in front of the screen. Thus, the point of touch is seen by the camera as an intense red spot. The camera transmits the images as RGB data to the microcontroller, which calculates the x and y coordinates of the point with the highest red intensity. The data is then sent to the system through a UART module to produce the required response. Thus, touch is generated on the glass surface [7]-[11].
Fig. 3: Laser Light Plane Technique
III. HARDWARE DESCRIPTION
The various hardware components used in the project are:
Coated Glass
The sheet of glass used as the touch screen is coated with a layer of acrylic, so as to form an image on the screen. Acrylic is a useful, clear plastic that resembles glass. Common brands of acrylic include Polycast, Lucite, Plexiglas and Optix. It is available in a variety of thicknesses and colours. The most popular uses for acrylic signage include facility signs and interior dimensional lettering. It can be printed, routed, engraved, bevelled, polished, painted, cut, glued, bent, drilled and lasered. Acrylic does not blast well [12].
Camera C3088
The C3088 is a 1/4" colour camera module with digital output. It uses OmniVision's CMOS image sensor OV6620. Combining CMOS technology with an easy-to-use digital interface makes the C3088 a low-cost solution for high-quality video image applications. The digital video port supplies a continuous 8/16-bit-wide image data stream. All camera functions, such as exposure, gamma, gain, white balance, colour matrix and windowing, are programmable through the I2C interface.
ATmega16 Microcontroller
The ATmega16 is a low-power CMOS 8-bit microcontroller based on the AVR enhanced RISC architecture.
By executing powerful instructions in a single clock cycle, the ATmega16 achieves throughputs approaching 1 MIPS per MHz, allowing the system designer to optimize power consumption versus processing speed.
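Since the serial link to the PC can be handled by the ATmega16's on-chip USART, the following is a minimal sketch of configuring it for the 19,200 bps rate quoted later in the software description and of sending one (x, y) coordinate pair. The two-byte framing is an assumption for illustration, not the authors' exact protocol.

/* Minimal sketch: ATmega16 USART at 19,200 baud, 8 data bits, 1 stop bit. */
#define F_CPU 16000000UL                            /* assumed 16 MHz clock */
#include <avr/io.h>
#include <stdint.h>

#define BAUD     19200UL
#define UBRR_VAL ((F_CPU / (16UL * BAUD)) - 1UL)    /* = 51 at 16 MHz */

void uart_init(void)
{
    UBRRH = (uint8_t)(UBRR_VAL >> 8);
    UBRRL = (uint8_t)UBRR_VAL;
    UCSRB = (1 << TXEN) | (1 << RXEN);                   /* enable transmitter and receiver */
    UCSRC = (1 << URSEL) | (1 << UCSZ1) | (1 << UCSZ0);  /* 8N1 frame format                */
}

void uart_send(uint8_t byte)
{
    while (!(UCSRA & (1 << UDRE))) ;   /* wait until the transmit register is free */
    UDR = byte;
}

/* Hypothetical framing: send the column, then the row, of the detected red spot. */
void send_touch(uint8_t x, uint8_t y)
{
    uart_send(x);
    uart_send(y);
}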
Plane Laser Light Module
These are encapsulated Class IIIa laser diodes of 5 mW output, with a 650 nm red wavelength. They can be driven from 2.8 V to 5.2 V. This particular laser diode has a lens attached to turn the dot into a line. The line beam spread is 120°. The line fades out and does not have a sharp cut-off. It is not a laser pointer, but a diode with an integrated driver, and it requires a 3 V to 5 V DC power supply. It has a 5 mW output and a safety label marked on it.
UART Module
A UART (Universal Asynchronous Receiver/Transmitter) is the microchip with programming that controls a computer's interface to its attached serial devices. Specifically, it provides the computer with the RS-232C Data Terminal Equipment (DTE) interface so that it can "talk" to and exchange data with modems and other serial devices.
IC Voltage Regulator
Voltage regulator ICs are available with fixed (typically 5, 12 and 15 V) or variable output voltages. They are also rated by the maximum current they can pass. Negative voltage regulators are available, mainly for use in dual supplies. Most regulators include some automatic protection against excessive current ('overload protection') and overheating ('thermal protection'). Many of the fixed voltage regulator ICs have three leads and look like power transistors, such as the 7805 +5 V, 1 A regulator. They include a hole for attaching a heat sink.
Projector
A projector or image projector is an optical device that projects an image (or moving images) onto a surface, commonly a projection screen. Most projectors create an image by shining a light through a small transparent lens, but some newer types of projectors can project the image directly by using lasers. A projector is a highly versatile presentation tool. Here we use a projector (BenQ/Epson) to project the computer screen onto the plane glass with the help of a thin acrylic sheet.
PC/Laptop
We do not need to build a computer specifically for this project. In fact, any recent computer with a decent processor and graphics card should work well. However, if you plan on putting the surface in an enclosure when you are done, then ensure the computer has adequate cooling and minimal thermal dissipation.
IV. EXPERIMENTAL SETUP AND RESULT
The various parts of the system are constructed and connected together to work in unison. The mechanical parts do not interact with the electrical parts, which simplifies the design. The main requirement of the mechanical parts is to hold the system together while allowing adjustments to be made to the positioning of the components relative to one another. The main parts that need to be held in place are the screen, the LEDs, the projector and the camera. The screen size settled on was 30" diagonal with an aspect ratio of 4:3. The final product needs to be sturdy, adjustable, upgradeable and portable. The system should be able to work both standing up, which is easiest for a single user, and lying down, to accommodate a large group of people.
Hardware Implementation
The three main components of our hardware design are as follows:
- Laser module
- Camera and its associated circuitry
- Outer casing for the entire device
Fig. 4: Plane Laser Light Module
Laser Module:
Our original plan, at the time of the project proposal, was to use an infrared laser to detect button presses using the CMOS camera, but we realized that user safety would be a major issue in that case. The user would never know even if he/she were staring directly at the laser and, therefore, there would be no way to prevent eye damage. In addition, we also realized that the CMOS camera we are using (OV6630) is not very effective at detecting infrared light. Hence, we decided to use a Class IIIa 635 nm red laser instead (Fig. 4). The laser module we bought came with a built-in driver; therefore, we did not have to worry about biasing the laser properly to make it operational. All we had to do was connect the laser to a 3 V power source, which we obtained using a simple 3 V voltage regulator. The laser module also came with a line-generating diffractive optical element (DOE) attached to it. However, since we did not know the fan angle of this DOE, we had to experiment with various distances in order to obtain a line length of at least 8.5", which was required to cover the entire screen. In the end, we had to place the laser at a distance of approximately 12.5" to obtain good results.
Camera and its Associated Circuitry:
We decided to use the C3088 1/4" color sensor module with digital output, which uses OmniVision's CMOS image sensor OV6630. The two primary reasons why we chose this specific camera module were its low cost and the fact that it is capable of outputting image color data in progressive scan mode. Progressive scanning was an important consideration for us, since we do not have enough computational power available on the 16 MHz ATmega16 microcontroller to process entire frames at once; however, we can certainly process images line by line as they come in. After rigorous testing, we realized that we could work with only the red channel data from the camera. Hence, we connected the 8-bit red channel output from the camera (UV[7:0]) to PORTA[7:0] on the Mega16. We decided to use a resolution of 176x144. At this resolution, we could capture at most 6 frames of color images per second. The camera output format was set to capture 16-bit UV/Y data, where UV carried BRBR data and Y carried GGGG data. The Y data was completely ignored.
Outer Casing for the Entire Device:
The hardware assembly for our device is designed to hold the camera at a fixed position.
Software Implementation
The software component was split into the following main components:
1) Implementing the I2C protocol to read and write registers of the camera
2) Reading values from the camera to obtain 6 frames every second
3) Processing the images
4) Sending serial data to update the array of scan codes in the Mega16
At first, we initialize PORTA on the Mega16 to take UV input from the camera and PORTC to communicate with the camera over the I2C interface. The baud rate is set to 19,200 bps for serial communication. We then run the calibrate function on the camera. Then we call a function called "init_cam", which performs a soft reset on the camera before writing the required values to the corresponding camera registers. These registers change the frame size to 176x144, turn on auto white balance, set the frame rate to 6 fps, and set the output format to 16-bit Y/UV mode with Y = G G G G and UV = B R B R.
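A minimal sketch of what this initialization step might look like is given below, reusing the i2c_reg_write() helper from the I2C sketch above. The camera's 7-bit device address and every register except 0x11 (the PCLK prescaler mentioned earlier) are placeholders, not values taken from the paper.

/* Sketch of "init_cam": soft-reset the camera, then slow PCLK down. */
#include <stdint.h>

#define CAM_ADDR       0x21   /* assumed 7-bit I2C address of the camera      */
#define REG_SOFT_RESET 0x12   /* assumed control register used for the reset  */
#define REG_CLK_PRESC  0x11   /* PCLK prescaler register (named in the text)  */

void i2c_reg_write(uint8_t dev_addr, uint8_t reg, uint8_t val);  /* see I2C sketch */

void init_cam(void)
{
    i2c_reg_write(CAM_ADDR, REG_SOFT_RESET, 0x80);  /* soft reset (bit is assumed)    */
    i2c_reg_write(CAM_ADDR, REG_CLK_PRESC, 0x3F);   /* slow PCLK so the AVR can keep  */
                                                    /* up (divider value is assumed)  */
    /* Further writes would select the 176x144 frame size, enable auto white
       balance, set the 6 fps frame rate and the 16-bit Y/UV output format;
       the exact register values are not given in the paper, so they are
       omitted here. */
}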
The code then enters an infinite loop which checks the status of the PS/2 transmit queue and tries to process the next captured frame if the queue is empty. If not, the queue is updated and the PS/2 transmission is allowed to continue. The microcontroller captures rows of data from the camera and processes each of them. We first wait for a negative edge on VSYNC, which indicates the arrival of new frame data on the UV and Y lines. Then, we wait for one HREF to go by, since the first row of data is invalid. At this point, we can clock in 176 pixels of data for a given vertical line. In the mode where the UV line receives BR data, the output is given by B11 R22 B13 R24 and so on. Since we only needed the red data, we keep an array of 88 values in which we store the data on the UV line every 2 PCLKs. The OV6630 also repeats the same set of pixels for consecutive rows, and thus 2 vertical lines processed would have data about the same pixels. Since we do not have enough memory to store entire frames of data to process, we do the processing after each vertical line. After each vertical line of valid data, HREF stays negative for about 0.8 ms and the camera data becomes invalid; this gives us ample time to process one line worth of data.
After each vertical line was captured, we looped through each pixel to check if it exceeded the red threshold found during calibration. For every pixel that met this threshold, we then checked if the pixel was part of a contiguous line of red pixels, which would indicate a touch. If such a pixel was found, we then mapped this pixel to a scan code by binary searching through an array of x, y values. If this scan code was found to be valid, we calculated the time for which the red pixel remained stationary, and then decided the action to be performed, i.e. a mouse click or a pointer movement (a sketch of this per-line processing is given below).
Experimental Setup
The whole system is arranged as shown in Fig. 5 and Fig. 6.
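The per-line thresholding and scan-code lookup described above might look roughly like the following sketch. The helper names, the minimum run length and the calibration-table layout are assumptions for illustration, not the authors' code.

/* Sketch: threshold one vertical line of red samples, look for a short
   contiguous run of bright pixels, and map the hit to a scan code by
   binary searching a calibration table sorted by (x, y). */
#include <stdint.h>

#define LINE_SAMPLES 88          /* red samples per vertical line (every 2 PCLKs)  */
#define MIN_RUN      3           /* assumed run length that counts as a touch      */

extern uint8_t red_line[LINE_SAMPLES];   /* filled while clocking in the line      */
extern uint8_t red_threshold;            /* found during calibration               */

typedef struct { uint8_t x, y, scan_code; } calib_entry_t;
extern calib_entry_t calib_table[];      /* sorted by (x, y) during calibration    */
extern uint8_t calib_entries;

/* Returns the scan code for this column, or 0 if no touch is seen. */
uint8_t process_line(uint8_t column)
{
    uint8_t i, run = 0;
    for (i = 0; i < LINE_SAMPLES; i++) {
        if (red_line[i] > red_threshold) {
            if (++run >= MIN_RUN) {
                uint8_t lo = 0, hi = calib_entries;      /* binary search          */
                while (lo < hi) {
                    uint8_t mid = (uint8_t)((lo + hi) / 2);
                    uint16_t key  = ((uint16_t)column << 8) | i;
                    uint16_t node = ((uint16_t)calib_table[mid].x << 8) | calib_table[mid].y;
                    if (node == key) return calib_table[mid].scan_code;
                    if (node < key)  lo = (uint8_t)(mid + 1);
                    else             hi = mid;
                }
                return 0;        /* bright run found but outside the calibrated area */
            }
        } else {
            run = 0;             /* the run of red pixels was broken                 */
        }
    }
    return 0;
}

The caller would time how long the same scan code keeps being returned to distinguish a click from pointer movement, as described above.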
Fig. 5: Hardware Setup
Fig. 6: Display Screen
Result
The project was tested successfully in the presence of teachers and students. The hardware setup was completed and the project was tested for controlling the mouse pointer and performing the 'left-click' action. The future developments that can be incorporated are:
- Developing a stylus pen for 'write' and 'copy-paste' actions.
- Upgrading the single-touch screen to a multi-touch one.
V. CONCLUSION
Touch technology has come a long way in the last decade. Just six years ago, most phones used traditional keypads; today, almost all smartphones have a touchscreen, and the technology has spread to tablets, handheld consoles and laptops as well. Although camera-based touch detection systems appear large and complicated, they are extremely useful in cases where the size of the display is very large. The system has several advantages: it needs no compliant surface or LED frames, it can use any transparent material as the projection surface, it is slightly cheaper than other technologies, and it has a simple setup. However, it also has disadvantages, including the inability to track traditional objects and fiducials, insensitivity to pressure, and possible occlusion (in the case of multi-touch) when only one laser is used, i.e. light hitting one finger blocks another finger from receiving light. The applications of the project include user-friendly presentation boards, advertising screens, route maps, etc.
REFERENCES
[1] Jain, Anjul, Diksha Bhargava, and Anjani Rajput. "Touch-screen Technology." International Journal of Advanced Research in Computer Science and Electronics Engineering (IJARCSEE) 2.1 (2013): pp. 074.
[2] Rosin, Hanna. "The Touch-Screen Generation." The Atlantic 20 (2013).
[3] J. Hu, G. Li, X. Xie, Z. Lv and Z. Wang, "Bare-fingers Touch Detection by the Button's Distortion in a Projector-Camera System," IEEE Transactions on Circuits and Systems for Video Technology, vol. 24, no. 4, pp. 566-575, April 2014.
[4] Katz, Itai, Kevin Gabayan, and Hamid Aghajan. "A Multi-touch Surface Using Multiple Cameras." Advanced Concepts for Intelligent Vision Systems. Springer Berlin Heidelberg, 2007.
[5] Kaur, Ramandeep. "To Develop an Interface Among C3088 Camera and Computer Using AVR Microcontroller," unpublished.
[6] Hu, Zhen, and Muzaffar, Junaid. "Inter-Integrated-Circuit (I2C)." (2005).
[7] Gwang Jun Lee, Sang Kook Lee, Hong Kun Lyu, and Jae Eun Jang, "External Light Noise-Robust Multi-Touch Screen Using Frame Data Differential Method," IEEE Journal of Display Technology, DOI 10.1109/JDT.2015.2438317, pp. 750-763.
[8] Marc Bender and Mark Lawford, "A Low-Power, Low-Cost Automotive Touchscreen with Real Controls," IEEE CCECE 2011.
[9] Peter Brandl, Michael Haller, Michael Hurnaus, Verena Lugmayr, Juergen Oberngruber, Claudia Oster, Christian Schafleitner, and Mark Billinghurst, "An Adaptable Rear-Projection Screen Using Digital Pens and Hand Gestures," IEEE 2007, DOI 10.1109/ICAT.2007.12.
[10] Katz, Itai, Kevin Gabayan, and Hamid Aghajan. "A Multi-touch Surface Using Multiple Cameras." Advanced Concepts for Intelligent Vision Systems. Springer Berlin Heidelberg, 2007.
[11] Chang, Rong, Feng Wang, and Pengfei You. "A Survey on the Development of Multi-touch Technology." Wearable Computing Systems (APWCS), 2010 Asia-Pacific Conference on. IEEE, 2010.
[12] Iezzi, Robert A., Scott Gaboury, and Kurt Wood. "Acrylic-fluoropolymer Mixtures and Their Use in Coatings." Progress in Organic Coatings 40.1 (2000): 55-60.