Monday, September 1, 2008

EngiSeminarGuru


Seminar Topics











This blog provides information on seminar topics for engineering students.





Projects and seminars play a very important role in engineering studies.


Seminar List



  • BrainGate

  • ABSTRACT
    As the power of modern computers grows alongside our understanding of the human brain, we move ever closer to making some pretty spectacular science fiction into reality.
    Imagine transmitting signals directly to someone's brain that would allow them to see, hear or feel specific sensory inputs. The biotech company Cyberkinetics developed BrainGate in 2003 in conjunction with the Department of Neuroscience at Brown University. The device was designed to help those who have lost control of their limbs or other bodily functions.
    The computer chip, which is implanted into the brain, monitors brain activity in the patient and converts the intention of the user into computer commands. The brain-computer interface (BCI) has only recently emerged as a field for useful and important research. Solving even a few of the vast unknowns associated with brain function, and the major difficulties inherent in current apparatus, may yield exciting results. As detection techniques and experimental designs improve, the BCI very likely will provide a wealth of alternatives for individuals to interact with their environment.





  • Grid Computing

  • What is grid computing?
    Grid computing is an emerging technology that can mean different things to different people, but here is a simple, serviceable definition of the concept: grid computing allows you to unite pools of servers, storage systems, and networks into a single large system, so you can deliver the power of multiple systems' resources to a single user point for a specific purpose. To a user, data file, or application, the system appears to be a single, enormous virtual computing system. Grid computing is the next logical step in distributed networking. Just as the Internet allows users to share ideas and files as the seeds of projects, grid computing lets us share the resources of disparate computer systems so people can actually start working on those projects.
    The major purpose of a grid is to virtualize resources to solve problems. The main resources grid computing is designed to give access to include, but are not limited to:
    Computing/processing power.
    Data storage/networked file systems.
    Communications and bandwidth.
    Application software.
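The pooling idea above can be sketched in a few lines. In this illustrative Python sketch (not part of the paper), a local worker pool stands in for a grid scheduler that unites many machines into one virtual system; the function names are invented for the example.

```python
from concurrent.futures import ThreadPoolExecutor

# A minimal sketch of the grid idea: a pool of workers is presented
# to the user as one virtual computing resource. A thread pool stands
# in for a grid scheduler; a real grid would span separate machines
# with their own storage and networks.
def compute_job(task_id):
    # Stand-in for a compute-heavy job submitted to the grid.
    return task_id, sum(i * i for i in range(10_000))

def run_on_grid(jobs, n_workers=4):
    # The caller sees one system; the pool fans the work out.
    with ThreadPoolExecutor(max_workers=n_workers) as pool:
        return dict(pool.map(compute_job, jobs))

results = run_on_grid(range(8))
assert len(results) == 8          # every submitted job completed
```

The user-facing call (`run_on_grid`) hides how many workers exist, which is the "single enormous virtual computing system" view the abstract describes.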



  • Quantum Computing

  • ABSTRACT:-
    This paper attempts to investigate the approach of quantum computing to bring about a new order in computing technology. A bit is the fundamental unit of information, classically represented as a 0 or 1 in your digital computer. Each classical bit is physically realized through a macroscopic physical system, such as the magnetization on a hard disk or the charge on a capacitor. A quantum computer uses quantum bits, or qubits, which are not binary but rather more quaternary in nature. This qubit property arises as a direct consequence of its adherence to the laws of quantum mechanics, which differ radically from the laws of classical physics. A qubit can exist not only in a state corresponding to the logical state 0 or 1, as in a classical bit, but also in states corresponding to a blend or superposition of these classical states. Richard Feynman was among the first to recognize the potential of quantum superposition for solving such problems much faster. For example, a system of 500 qubits, which is impossible to simulate classically, represents a quantum superposition of as many as 2^500 states. Each state would be classically equivalent to a single list of 500 1's and 0's. Any quantum operation on that system -- a particular pulse of radio waves, for instance, whose action might be to execute a controlled-NOT operation on the 100th and 101st qubits -- would simultaneously operate on all 2^500 states. Hence with one fell swoop, one tick of the computer clock, a quantum operation could compute not just on one machine state, as serial computers do, but on 2^500 machine states at once! Eventually, however, observing the system would cause it to collapse into a single quantum state corresponding to a single answer, a single list of 500 1's and 0's, as dictated by the measurement axiom of quantum mechanics.
    The reason this is an exciting result is that this answer, derived from the massive quantum parallelism achieved through superposition, is the equivalent of performing the same operation on a classical supercomputer with ~10^150 separate processors (which is of course impossible)!
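The classical cost of simulating qubits can be made concrete with a short sketch. This illustrative Python example (not from the paper) represents an n-qubit register by its 2^n amplitudes and collapses it to one basis state on measurement, exactly the blow-up and collapse described above; it is a toy simulation, not a quantum algorithm.

```python
import random

# An n-qubit register needs 2**n amplitudes, which is why 500 qubits
# (2**500 states) cannot be simulated on any classical machine.
def uniform_superposition(n):
    dim = 2 ** n
    amp = (1 / dim) ** 0.5           # equal amplitude for every basis state
    return [amp] * dim

def measure(state):
    # Measurement collapses the register to a single basis state,
    # chosen with probability |amplitude|**2 (the measurement axiom).
    probs = [abs(a) ** 2 for a in state]
    return random.choices(range(len(state)), weights=probs)[0]

state = uniform_superposition(3)     # 8 amplitudes for 3 qubits
assert len(state) == 2 ** 3
outcome = measure(state)             # one of the 8 basis states survives
```

Even at 30 qubits this list would hold over a billion amplitudes, which is the scaling argument the abstract relies on.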





  • ARTIFICIAL INTELLIGENCE AND ITS APPLICATIONS

  • ABSTRACT:
    A nation's development, strength, economy and progress depend on the status of its defence. This paper deals with the various generations of computers, of which the fifth generation uses Artificial Intelligence. A few applications of Artificial Intelligence used in various technologies are also discussed. The main focus of the paper is on chaser missiles, which implement missile technology using Artificial Intelligence. The Anupama processor designed in India can be used in chasers. This shows the development of Artificial Intelligence in Indian defence systems. The On-Board Computer department of RCI and ASL of DRDO are working on similar missile technology, mostly on the guidance systems for Agni, Trishul, Akash and NAG, and the chaser using AI technology may replace the existing guidance systems of these missiles.





  • Cell Processor

  • ABSTRACT:
    This paper presents the Cell processor. The Cell processor, designed for the new PlayStation 3, promises seemingly obscene computing capabilities and awesome graphics for the ultimate gaming experience. Although it has been primarily touted as the technology behind the PlayStation 3, Cell is designed for much more. Next-generation consumer technologies such as Blu-ray, HDTV, HD camcorders and of course the PS3 will all require a very high level of computing power, and this is going to need chips to provide it. Cell will be used for all of these and more; IBM will also be using the chips in servers.
    Key Words:
    GOPS: Billions of Operations per Second.
    GFLOPS: Billions of floating point operations per second.
    APU: Attached Processing Unit.
    DMAC: Direct Memory Access Controller.
    PU: Processing Unit.
    GPU: Graphical Processor Unit.
    1. SPECIFICATION:
    An individual Processing Element (i.e. hardware Cell) is made up of a number of elements:
    Processing Unit (PU)
    8 × Attached Processing Units (APUs)
    Direct Memory Access Controller (DMAC)
    Input/Output (I/O) Interface
    2. Some Important Specifications
    Capable of running at speeds beyond 4 GHz.
    Memory bandwidth: 25.6 GBytes per second.
    I/O bandwidth: 76.8 GBytes per second.
    256 GFLOPS (single precision at 4 GHz).
    256 GOPS (integer at 4 GHz).
    25 GFLOPS (double precision at 4 GHz).
    235 million transistors.
    10 digital thermal sensors.
    Power consumption has been estimated at 60 - 80 Watts at 4 GHz for the prototype, but this could change in the production version.



  • Computing with DNA

  • ABSTRACT:
    As the world moves towards miniaturization, there is a need for efficient memory utilization. This forces many recent technologies to emerge and to overcome existing CMOS-based designs. The use of DNA as a memory storage device is one such recent technology for meeting miniaturization constraints. Computing with DNA holds out the promise of important and significant connections between computers and living systems, as well as promising massively parallel computations. Before these promises are fulfilled, however, important challenges related to errors and practicality in DNA computing have to be addressed. On the other hand, new directions in designing molecular algorithms for DNA computing might circumvent the problems that have hindered development so far. In this paper, the state of DNA computing that has developed from Adleman's work is reviewed. The purpose is not to produce a comprehensive bibliography of the field (1); rather, it is to summarize the main trends in the field, to identify the key challenges for researchers, and to speculate on future developments.





  • DNA: THE FUTURE OF COMPUTING

  • ABSTRACT:
    DNA computing is a nascent technology that seeks to capitalize on the enormous informational capacity of DNA. These biological molecules can store huge amounts of information and are able to perform operations similar to a computer's through the deployment of enzymes, biological catalysts that act like software to execute desired operations.
    This paper gives an insight into evolution and the future of DNA computing. Scientists around the globe are now trying to marry computer technology and biology by using nature's own design to process information. Research in this area began with an experiment by Leonard Adleman, a computer scientist at USC who surprised the scientific community in 1994 by using the tools of molecular biology to solve a hard computational problem.
    In terms of speed and size, however, DNA computers surpass conventional computers. While scientists say silicon chips cannot be scaled down much further, the DNA molecule found in the nucleus of all cells can hold more information in a cubic centimeter than a trillion music CDs. A spoonful of DNA contains 15,000 trillion computers. While a desktop PC is designed to perform one calculation very fast, DNA strands produce billions of potential answers simultaneously. This makes them suitable for solving "fuzzy logic" problems that have many possible solutions rather than the either/or logic of binary computers. In the future, some speculate, there may be hybrid machines that use traditional silicon for normal processing tasks but have DNA co-processors that can take over specific tasks they would be more suitable for.
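Adleman's 1994 experiment solved a small Hamiltonian-path instance. As an illustration of the search that his DNA strands performed in parallel, here is a classical brute-force sketch in Python; the graph below is invented for the example and is not Adleman's original instance.

```python
from itertools import permutations

# Brute-force Hamiltonian-path search: try every vertex ordering and
# keep the first one whose consecutive pairs are all edges. Adleman's
# experiment generated all such candidate paths at once as DNA strands
# in a test tube, then filtered out the invalid ones chemically.
edges = {(0, 1), (1, 2), (2, 3), (1, 3), (3, 4)}   # made-up directed graph

def hamiltonian_path(n_vertices):
    for order in permutations(range(n_vertices)):
        if all((a, b) in edges for a, b in zip(order, order[1:])):
            return order                 # visits every vertex exactly once
    return None

path = hamiltonian_path(5)
assert path == (0, 1, 2, 3, 4)
```

A serial computer checks the n! orderings one by one; the promise of DNA computing is that billions of these candidates are examined simultaneously.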





  • Embedded System, Processor And EDA Tools

  • ABSTRACT:
    This paper describes the main features and functions of the Pentium® 4 processor micro-architecture. We present the front end of the machine, including its new form of instruction cache called the Execution Trace Cache. We also describe the out-of-order execution engine, including the extremely low-latency double-pumped ALU (Arithmetic Logic Unit) that runs at more than 3.4 GHz. We also discuss the memory subsystem, including the very low-latency Level 1 data cache that is accessed in just two clock cycles. We then touch on some of the key features that allow the Pentium 4 processor to have outstanding floating-point and multimedia performance. We also briefly cover various processors such as the Pentium Pro, other Intel processors, pre-Pentium processors, and the Pentium processor. The paper also covers Advanced Micro Devices (AMD) processors such as the Athlon, as well as Motorola's processors. Some key performance numbers for this processor, comparing it to the Pentium® III processor, are given.
    Designing a mobile processor calls for different power/performance tradeoffs than designing a traditional high-performance processor. In this paper, the design philosophy adopted by the Intel Pentium M processor's architects to achieve the best performance within given power and thermal constraints is explained. Integrating EDA tools to enable interoperability and ease of use has traditionally been a very time-consuming and complicated job. Conventionally, each tool comes with a unique and simple set of commands for interactive use, such as SIS, VIS, and Magic, but this discourages further exploration of the underlying system functionality and lacks the full programming capability of a scripting language. Not only is the code hard to reuse, but rapid prototyping of a new algorithm is also impossible. A new algorithm may take years to develop, and it has to be started from scratch. Even when a new algorithm is developed, it may struggle between various formats. In this paper, we study and address how to easily integrate Application Programming Interfaces (APIs) into the most popular scripting languages, such as Tcl or Perl. The paper also explains embedded systems, covering their various characteristics, platforms, tools, operating systems, the design of embedded systems and some examples of embedded systems.





  • Embedded Systems as an Open Control Platform

  • ABSTRACT:
    The Open Control Platform (OCP) enables important applications that are impractical or intractable using current approaches. The OCP provides an open, middleware-enabled software framework and development platform for controls technology researchers who want to demonstrate their technology in simulated or actual embedded system platforms. The embedded system domain of particular interest to the SEC program is that of uninhabited aerial vehicles (UAVs).
    This paper describes the OCP approach to handling the challenges of UAV control, including support for all levels of vehicle control (including low-level flight control), interaction between a UAV and other platforms (e.g., another UAV, a ground station, or control by a piloted aircraft), innovative scheduling techniques, adaptive resource management, and support for dynamic reconfiguration.





  • FPGA Implementation of AES Encryption and Decryption

  • ABSTRACT:
    This paper presents a high-speed, fully pipelined FPGA implementation of AES (Advanced Encryption Standard, also known as the Rijndael algorithm) encryption and decryption, which was selected by the National Institute of Standards and Technology (NIST) as US FIPS PUB 197 in November 2001. The AES algorithm is implemented on an FPGA. C source for AES is already provided to work in software. The application works in the following manner: any file can be encrypted in software and then transferred to the machine containing the FPGA for decryption, and vice versa, i.e. the file may be encrypted in hardware and decrypted in software. The hardware design concentrates mainly on speed-up along with silicon area optimization. The work steps include paper design; writing VHDL code (Very high speed integrated circuit Hardware Description Language); simulating the code on ModelSim SE PLUS 5.8b; synthesizing and implementing (i.e. Translate, Map, and Place and Route) the code on Xilinx Project Navigator ISE 7i with device XC2V6000 of the Virtex-II family and the XST synthesis tool; and finally implementing it on an ALPHA DATA ADM-XRC-II board with a Xilinx Virtex-II PMC 2V6000 FPGA. The ADM-XRC-II is a high-performance reconfigurable PMC (PCI Mezzanine Card) based on the Xilinx Virtex-II range of platform FPGAs. The design has a maximum frequency of 76.699 MHz. It has been interfaced to the host PC with ZBT (Zero Bus Turnaround) RAM on the AlphaData hardware plane through the C programming language, and the ZBT RAM to our own AES module through the user module (which is already provided).
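The encrypt-in-software, decrypt-elsewhere workflow described above can be sketched independently of the hardware. The Python below uses a keyed XOR stream derived from SHA-256 purely as a stand-in symmetric cipher; it is not AES and offers none of its guarantees, but the round-trip with a shared key mirrors the paper's software/hardware exchange.

```python
import hashlib

# Illustrative stand-in for the encrypt-here/decrypt-there workflow.
# This keyed XOR stream is NOT AES -- a real system would use an AES
# implementation (such as the hardware core the paper describes).
def keystream(key, length):
    # Derive a byte stream from the key with SHA-256 in counter mode.
    out = b""
    counter = 0
    while len(out) < length:
        out += hashlib.sha256(key + counter.to_bytes(4, "big")).digest()
        counter += 1
    return out[:length]

def crypt(key, data):
    # XOR with the key-derived stream; applying it twice decrypts,
    # just as AES decryption inverts AES encryption under one key.
    ks = keystream(key, len(data))
    return bytes(a ^ b for a, b in zip(data, ks))

key = b"shared-secret"
ciphertext = crypt(key, b"file contents")
assert crypt(key, ciphertext) == b"file contents"
```

The point is the symmetry: whichever side (software or FPGA) holds the shared key can undo what the other side did.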



  • IPv6: “Upgrading the Internet”

  • ABSTRACT
    The following paper stresses the need for Internet Protocol version 6 (IPv6) in the ever-growing world of the Internet. IPv6 is designed to overcome the weaknesses of IPv4, to enable new computing and communications paradigms, and to provide a flexible and operationally robust platform for future Internet growth.
    The exponential growth of the Internet has resulted in the impending exhaustion of the IPv4 address space. Already, the limits of IPv4's addressing system have pushed the adoption of techniques such as Network Address Translation (NAT) and Classless Inter-Domain Routing (CIDR) to stretch the available IP addresses. In spite of these techniques, developing countries like India are facing a shortage of IP addresses and need to upgrade. The paper highlights new services that need the benefits of IPv6 (larger address space, security, end-to-end service) and that will drive commercial IPv6 implementations. We also examine the current state of IPv6 in terms of experimental and real-life implementations. The paper further deals with the perceived benefits of IPv6, key differences between IPv4 and IPv6, the steps involved in IPv6 transition mechanisms, the cost implications of the IPv6 transition, and more.
    IPv6 thus holds the honour of being the “Next Generation Internet Protocol.”
    Keywords: paradigms, interdomain, routing, Internet protocol.
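The address-space difference driving this transition is easy to verify with Python's standard `ipaddress` module; the networks chosen below are only illustrative.

```python
import ipaddress

# IPv4 offers 2**32 addresses; IPv6 offers 2**128. This gap is what
# relieves the exhaustion pressure the abstract describes.
v4_all = ipaddress.ip_network("0.0.0.0/0")
v6_all = ipaddress.ip_network("::/0")
assert v4_all.num_addresses == 2 ** 32
assert v6_all.num_addresses == 2 ** 128

# NAT-style sharing stretches IPv4: many private hosts sit behind
# one public address using a private range such as 192.168.0.0/16.
private = ipaddress.ip_network("192.168.0.0/16")
assert private.num_addresses == 65536
```

With IPv6, every one of those privately-addressed hosts could instead hold a globally routable address, restoring the end-to-end model mentioned above.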





  • Knowledge Engineering Based Intelligent Tutoring System For Patient Education

  • ABSTRACT
    Educating patients in a cogent manner results in quick recovery and sustained good health. Hence it has become the need of the hour in the medical domain to develop effective methods of patient education. A brief review of the methods prevalent today is presented and their shortcomings are analyzed. A novel approach to systematically structuring this abstract domain is suggested. Making use of the immense advances in information technology, a practical solution based on an intelligent tutoring system is proposed for educating patients effectively and efficiently. The various stages in the development of an intelligent system are illustrated and the relevant issues discussed.





  • Linux Kernel Evolutions and Module Coupling Problem

  • ABSTRACT
    Common coupling (sharing global variables across modules) is a metric for software quality and has been used in studies of maintainability. But when the global variables in question are referenced in software like a kernel, one must forecast the behavior of the whole operating system. We explore this issue by analyzing a case study based on the Linux system. In particular, our conclusions for coupling based on individual fields are similar in spirit to the results reported previously (by others) based on complete data structures. We particularly consider the Linux system, as its code is available for study.
    There are about 365 versions of Linux. For every version, there are a number of instances of common (global) coupling between each of the 17 core kernel modules and all the other modules in that version. The number of instances of common coupling increases with the version number, as an updated Linux kernel has more modules and a larger code base. The number of lines of code also grows linearly with the version number. So unless Linux is restructured with a minimal level of common coupling, the Linux kernel may become hard to maintain and manage, and at some future date this will make Linux exceedingly hard to maintain. In this paper we present the problem of common coupling in the Linux kernel that has arisen from its independent development by programmers spread across the world.
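A toy version of the measurement makes the metric concrete. In this sketch (module and variable names are invented for illustration, not taken from the Linux source), each module is reduced to the set of global variables it references, and common coupling is counted over module pairs.

```python
# Toy measure of common coupling: for each pair of modules, count the
# global variables they both reference. All names below are invented.
modules = {
    "sched":  {"jiffies", "current", "nr_running"},
    "fork":   {"current", "nr_threads"},
    "signal": {"current", "jiffies"},
}

def coupling_instances(mods):
    names = sorted(mods)
    count = 0
    for i, a in enumerate(names):
        for b in names[i + 1:]:
            count += len(mods[a] & mods[b])   # globals shared by the pair
    return count

assert coupling_instances(modules) == 4
```

Re-running such a count over successive kernel versions is, in miniature, how the growth trend described above would be tracked.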



  • Neural Networks

  • ABSTRACT
    This report is an introduction to Artificial Neural Networks (ANNs). The various types of neural networks are explained and demonstrated, applications of neural networks (such as ANNs in medicine) are described, and a detailed historical background is provided. The connection between the artificial neuron and the real thing is also investigated and explained. An engineering approach gives a description of a simple neuron, firing rules for neuron decisions, pattern recognition, and an explanation of more complicated neurons. The architecture of a neural network includes feed-forward networks, feedback networks, network layers, and perceptrons. Applications of neural networks include medicine (the instant physician, the electronic nose), business, and marketing.
    The paper gives an overall view of neural networks, which may prove helpful to researchers in this field of study.
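The firing rule mentioned above can be shown with a minimal single-neuron sketch in Python (an illustration, not code from the report): the neuron outputs 1 when its weighted input sum reaches a threshold, here wired to compute logical AND.

```python
# A single artificial neuron with a simple firing rule: it "fires"
# (outputs 1) when the weighted sum of its inputs reaches a threshold.
def neuron(inputs, weights, threshold):
    activation = sum(x * w for x, w in zip(inputs, weights))
    return 1 if activation >= threshold else 0

# With unit weights and a threshold of 1.5, the neuron computes
# logical AND of two binary inputs: it fires only when both are 1.
and_weights = [1.0, 1.0]
assert neuron([1, 1], and_weights, 1.5) == 1
assert neuron([1, 0], and_weights, 1.5) == 0
assert neuron([0, 0], and_weights, 1.5) == 0
```

Layering many such units, with learned rather than hand-picked weights, gives the feed-forward architectures the report surveys.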





  • Research On Making Data Communication More Secure
    - By Inheriting Steganography In Cryptography

  • Abstract
    The goal of cryptography is to provide confidentiality and privacy by encrypting information. If this information is deciphered by an unauthorized party, that goal is defeated.
    In this paper we present a technique that allows cryptography to inherit some features from steganography, in such a way that the information is both encrypted and hidden. The proposed solution is to use an image to hide textual data: the process transfers encrypted textual data, but in image format. This approach has a unique disguising feature: since images are often used in the context of steganography (here steganography itself is not used, but its features are inherited), anyone who finds a suspicious image would attempt steganalysis on it rather than cryptanalysis. The proposed technique provides enhanced confidentiality and privacy in personal communication.
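The hide-encrypted-text-in-an-image idea can be sketched with least-significant-bit embedding, one common steganographic technique (the paper does not specify its exact embedding method, so this is an assumed illustration). Plain byte arrays stand in for image pixel values here.

```python
# Hide (already-encrypted) bytes in the least-significant bits of
# cover data. Changing only the lowest bit of each "pixel" leaves
# the cover visually unchanged while carrying the payload.
def embed(cover, secret):
    bits = [(byte >> i) & 1 for byte in secret for i in range(8)]
    assert len(bits) <= len(cover), "cover too small for the payload"
    stego = bytearray(cover)
    for i, bit in enumerate(bits):
        stego[i] = (stego[i] & 0xFE) | bit   # overwrite the lowest bit
    return bytes(stego)

def extract(stego, n_bytes):
    # Reassemble each payload byte from 8 consecutive low bits.
    out = bytearray()
    for b in range(n_bytes):
        byte = 0
        for i in range(8):
            byte |= (stego[b * 8 + i] & 1) << i
        out.append(byte)
    return bytes(out)

cover = bytes(range(200))                    # stand-in "pixel" values
stego = embed(cover, b"secret")
assert extract(stego, 6) == b"secret"
```

In the paper's scheme the payload would be ciphertext, so even a successful steganalysis only yields encrypted data, which is the layered protection the abstract claims.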





  • Wearable Computers

  • ABSTRACT:
    Computer technology has played an important role in businesses throughout the years. There has been active development of increasingly portable computer hardware. The development originated with desktop and laptop units and is becoming increasingly apparent in palmtop, handheld and now wearable computers.
    A wearable computer is a computing device small and light enough to be worn on one's body without causing discomfort. Unlike a laptop or a palmtop, a wearable computer is constantly turned on and interacts with the wearer during real-world tasks. The information it provides can even be highly context-sensitive. Wearable computers are gradually transforming our technology by reshaping the bulky desktop computer into a small, lightweight device as unnoticeable as clothing, which is accessible all the time.
    Sometimes the location of a desktop or laptop computer is inconvenient or inefficient. When accurate information is not available in a timely manner, production decreases. This is a problem for many businesses throughout the world. With rising costs and demand for increased efficiency, wearable computers give personnel real-time access to critical information.
    The wearable computer provides the ultimate in network access: hands-free, heads-up operation with complete mobility and ample computing power. We have all the technologies needed to make a viable wearable computer today, technology which adds value to human knowledge, memory and intelligence.
    This report discusses the lightweight, ultra-mobile technology of wearable computers thoroughly, which is sure to become the world's next mania.





  • Wi-Fi: A New Era in Networking Technology

  • ABSTRACT:
    The enhancements in Internet and network technology over the last decade can be leveraged effectively to build Wireless Fidelity (Wi-Fi) systems, which can enable rapid expansion of telecom and Internet access in developing countries. However, the design of a Wi-Fi system requires one to understand some fundamentals concerning the access network and its connectivity to the backbone network, as well as the traffic requirements for voice and Internet connections. Equally important is the concern for capacity and spectral efficiency, especially as higher bit-rate Internet systems become a must for developing countries to get a fair share of the economic advantages that telecom technologies provide. The paper further takes a brief look at some recent technological developments which are likely to impact wireless networking technology.





  • Wireless mesh networks - a next-generation high-performance wireless networking system
    --- a conceptual study

  • Abstract :
    As wireless networking technology marches towards substantial growth, there is a need for self-organized and self-configured wireless networking systems to suit the needs of the next generation. In this paper a conceptual study has been made of a key technology, namely wireless mesh networks (WMNs). Considerable progress is being made in the development of WMNs at different levels; however, many technical issues still exist in this field. This article describes the network architecture of WMNs and the current state-of-the-art protocols and algorithms for WMNs. Some of the open research issues at the various protocol layers are also discussed.





  • Smartphone: an Embedded System for Universal Interaction

  • INTRODUCTION:
    Recent advances in technology make it feasible to incorporate significant processing power in almost every device that we encounter in our daily life. These embedded systems are heterogeneous, distributed throughout the surrounding environment, and capable of communicating through wired or wireless interfaces. Embedded systems are generally electronic devices that incorporate microprocessors within their implementation. A microprocessor in the device makes it possible to remove bugs, make modifications, or add new features simply by rewriting the software that controls the device. Ideally, we would like to have a single device that acts as both a personal server and a personal assistant for remote interaction with embedded systems located in the proximity of the user. This device should be programmable and support dynamic software extensions for interaction with newly encountered embedded systems (i.e., dynamically loading new interfaces). The smartphone is an emerging mobile phone technology that supports Java program execution or BREW technology and provides both short-range wireless connectivity (Bluetooth) and cellular network connectivity through which the Internet can be accessed.





  • Secure File System

  • ABSTRACT:
    This paper investigates an approach to a secure file system that protects it from virus attacks.
    Nowadays, as we are all aware of viruses, we need a permanent solution to protect our file systems. To this end, we suggest a solution for secure file system handling, using encryption and decryption on the file system storage.





  • Optical computer

  • ABSTRACT:
    In the modern age, a computer must perform tasks at the speed of light, so that it is able to control a robot in space 50 million km away.
    This brings about the idea of building an “optical computer” that works on light (photons) instead of electrons, or on a combination of both.
    Research is in full progress on a next-generation computer utilizing the properties of light. The so-called “optical computer” of the future is the subject of discussion by researchers all over the world, as to its possibilities and the problems along the way to realization.
    The paper covers optical switches, optical gates, interconnection modules, pipelined processors, storage elements, optical array logic and image processing, and gives details about light and the molecules used to make the thin films of an optical computer. This high-speed optical computer will make some “impossible” things look like “I am possible”. Optics is entering all phases of computer technology; the paper discusses how and why optics is likely to be used in next-generation computers, along with optical devices for optical computing, optical associative memories, optical interconnections and optical logic.
    All-optical components will have a speed advantage over electro-optical (EO) devices; the paper explains the unique advantage optics enjoys over conventional electronics and why this trend will continue.





  • Real Time Operating System

  • ABSTRACT:
    OPERATING SYSTEM
    An operating system is the main program that always runs on a computer. An operating system is defined as “an intermediary between the user and the computer hardware.”
    There are four basic classifications of operating systems:
    Single-user systems: provide a platform for only one user at a time. They are popularly associated with desktop operating systems which run on standalone systems where no user accounts are required.
    Multi-user systems: provide regulated access for a number of users by maintaining a database of known users. This refers to computer systems that support two or more simultaneous users. All mainframes are multi-user systems, but most personal computers and workstations are not. Another term for multi-user is time-sharing.
    An OS can also be described as multi-tasking or single-tasking. In multi-tasking, two or more programs are allowed to run at a time. In single-tasking, only one program can run at a time.
    REAL TIME OPERATING SYSTEM:
    ‘A real-time operating system has well-defined, fixed time constraints. Processing must be done within the defined constraints, or the system will fail. A real-time operating system is considered to function correctly only if it returns the correct result within its time constraints.’
    A real-time operating system is one in which the correctness of the computations depends not only upon logical correctness but also upon the time at which the result is produced.
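The definition above, that a correct answer delivered late counts as a failure, can be illustrated with a small sketch (Python used for illustration only; a real RTOS enforces deadlines inside the scheduler, not after the fact as done here).

```python
import time

# In a real-time system, correctness = right result AND on time.
# This sketch runs a job, then checks its elapsed time against the
# deadline, reporting both the result and whether the deadline held.
def run_with_deadline(job, deadline_s):
    start = time.monotonic()
    result = job()
    elapsed = time.monotonic() - start
    return result, elapsed <= deadline_s

# A fast job easily meets a generous one-second deadline.
result, on_time = run_with_deadline(lambda: sum(range(1000)), 1.0)
assert result == 499500
assert on_time
```

Had `on_time` come back false, a real-time system would treat the run as a failure even though `result` is numerically correct, which is exactly the distinction drawn above.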





  • Modern Communication Systems

  • ABSTRACT:
    Communication is the way in which people interact with each other through some medium, such as communication devices. This paper tells how communication worked before the 20th century and shows how important communication systems have become in our daily life. For the convenience of communication, many communication systems are being improved; communication systems like wireless and mobile communications are the most important. The modern communication systems described here can thus be advantageous and useful.





  • Nanotechnology-Carbon Nanotubes

  • Introduction:
    Nanotechnology is often referred to as a general-purpose technology. That's because in its mature form it will have a significant impact on almost all industries and all areas of society. It offers better-built, longer-lasting, cleaner, safer and smarter products for the home, for communications, for medicine, for transportation, for agriculture and for industry in general.
    Nanotechnology, which deals with devices typically less than 100 nanometers in size, is expected to make a significant contribution to the fields of computer storage, semiconductors, biotechnology, manufacturing and energy. Envisioned are all kinds of amazing products, including extraordinarily tiny computers that are very powerful, building materials that withstand earthquakes, advanced drug-delivery systems, and custom-tailored pharmaceuticals, as well as the elimination of invasive surgery, because repairs can be made from within the body.
    An excellent example is that darling of the nanotech world, the carbon nanotube. Carbon occurs naturally as graphite (the soft black material often used in pencil leads) and as diamond. The only difference between the two is the arrangement of the carbon atoms. When scientists arrange the same carbon atoms into a chicken-wire pattern and roll them up into minuscule tubes only 10 atoms across, the resulting nanotubes acquire some rather extraordinary traits.
    Much of the current nanotechnology research worldwide focuses on these nanotubes. Scientists have proposed using them for a wide range of applications: in the high-strength, low-weight materials needed for space applications; as molecular wires for nanoelectronics; embedded in microprocessors; and as tiny rods and gears in nanoscale machines.



  • Bio-Cybernetics

  • ABSTRACT:
    Man always wanted to scale new heights of technology in order to make his life better than the best. His thirst for sophisticated life has never come to an end. He always demanded more power, more comfort. In this quest we present you the world of bionics which examines the ways in which technology is inexorably driving us to a new and different level of humanity, Transhumanism. As scientists draw on nanotechnology, molecular biology, artificial intelligence, and innovative bio-materials science, they are routinely using sophisticated surgical techniques to implant computer chips and drug-dispensing devices into our bodies, designing fully functional man-made body parts, and linking human brains with computers to make people healthier, smarter, and stronger. In short, we are going beyond what was once only science fiction to create bionic people with fully integrated artificial components and it will not be long before we reach the ultimate goal of constructing a completely synthetic human-like being.
    The central idea of bionics is learning from nature; hence, by the bio-mimetic approach we can improve the quality of living. For example, an ant can lift a mass 51 times greater than its body weight.



  • 4G - Future Warrior

  • ABSTRACT
    Information is power. Nowhere is this truer than on the battlefield, where the ability to communicate clearly and rapidly pass on information spells the difference between survival and death. 4G (4th Generation) is the technology that is going to drive a soldier in the field in the future. The key to empowering the military with tactical broadband voice, video and data is 4G communications technology. This technology adopts wireless technology on the platform of fixed networks, advanced antenna technologies and more advanced wireless security technologies. Next is the gear for the future warrior. The system provides enhanced vision, with Ground Guidance, Unit Detection, Soldier Status, Target Hand-Off and Soldier Rescue during battle. The uniform, along with its armor and onboard computer, will monitor the soldier's overall physiological and psychological state and how they are performing in the battle zone, and will enhance human performance, weighing 50 pounds from head to toe against the 120 pounds of the present-day system.










    SeminarGuru




    Created By




    Amol Patil, Jalgaon.


