
Machine perception

Machine perception is the capability of a computer system to interpret data in a manner similar to the way humans use their senses to relate to the world around them.[1][2][3] Computers take in and respond to their environment through attached hardware. Until recently, input was limited to a keyboard or a mouse, but advances in both hardware and software have allowed computers to take in sensory input in a way similar to humans.[1][2]

Machine perception allows the computer to use this sensory input, alongside conventional computational means of gathering information, to collect information with greater accuracy and to present it in a way that is more comfortable for the user.[1] Its areas include computer vision, machine hearing, machine touch, and machine olfaction, since artificial scents are, at the chemical-compound and molecular level, identical to natural ones and therefore indiscernible from them.[4][5]

The end goal of machine perception is to give machines the ability to see, feel and perceive the world as humans do, and therefore to be able to explain in a human way why they are making their decisions, to warn us when they are failing and, more importantly, why they are failing.[6] This purpose is very similar to the purposes proposed for artificial intelligence generally, except that machine perception would only grant machines limited sentience, rather than bestow upon them full consciousness, self-awareness, and intentionality.

Machine vision

Computer vision is a field that includes methods for acquiring, processing, analyzing, and understanding images and high-dimensional data from the real world to produce numerical or symbolic information, e.g., in the form of decisions. Computer vision has many applications already in use today, such as facial recognition, geographical modeling, and even aesthetic judgment.[7]
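
As an illustration of the kind of processing involved, the sketch below uses OpenCV's bundled Haar-cascade face detector to turn raw pixel data into symbolic information (bounding boxes around faces). It is a minimal example rather than a description of any particular system, and the input filename "photo.jpg" is hypothetical.

```python
# Minimal face-detection sketch using OpenCV's bundled Haar cascade.
# Assumes the opencv-python package is installed and that "photo.jpg"
# (a hypothetical example file) exists in the working directory.
import cv2

# Load the image and convert it to grayscale, since the Haar detector
# operates on single-channel intensity data.
image = cv2.imread("photo.jpg")
gray = cv2.cvtColor(image, cv2.COLOR_BGR2GRAY)

# The cascade file ships with OpenCV; it encodes learned contrast
# features that approximate a frontal human face.
cascade_path = cv2.data.haarcascades + "haarcascade_frontalface_default.xml"
detector = cv2.CascadeClassifier(cascade_path)

# detectMultiScale scans the image at several scales and returns
# bounding boxes (x, y, width, height) for each detected face.
faces = detector.detectMultiScale(gray, scaleFactor=1.1, minNeighbors=5)

for (x, y, w, h) in faces:
    cv2.rectangle(image, (x, y), (x + w, y + h), (0, 255, 0), 2)

cv2.imwrite("photo_annotated.jpg", image)
print(f"Detected {len(faces)} face(s)")
```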

However, machines still struggle to interpret visual input accurately if that input is blurry or if the viewpoint from which stimuli are viewed varies often. Computers also struggle to determine the proper nature of a stimulus that is overlapped by, or seamlessly touching, another stimulus; this relates to the principle of good continuation. Machines likewise struggle to perceive and record stimuli behaving according to the apparent-movement principle researched by Gestalt psychologists.

Machine hearing

Machine hearing, also known as machine listening or computer audition, is the ability of a computer or machine to take in and process sound data such as speech or music.[8][9] This area has a wide range of applications, including music recording and compression, speech synthesis, and speech recognition.[10] Moreover, this technology allows a machine to replicate the human brain's ability to selectively focus on a specific sound against many competing sounds and background noise, an ability called "auditory scene analysis"; the machine segments the several audio streams occurring at the same time.[8][11][12] Many commonly used devices, such as smartphones, voice translators, and cars, make use of some form of machine hearing. Present technology still occasionally struggles with speech segmentation, that is, picking out the individual words within sentences, especially when human accents are taken into account.
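
To make the idea of speech segmentation concrete, the following sketch performs a toy, energy-based segmentation of a synthetic signal in pure NumPy: frames whose short-time energy exceeds a threshold are grouped into "active" segments. Real speech recognizers use far richer acoustic and language models; the signal, frame sizes, and threshold here are illustrative assumptions only.

```python
# Toy energy-based segmentation: split an audio signal into "active"
# regions separated by silence. The signal is synthetic and the
# parameter values are arbitrary illustrations.
import numpy as np

sample_rate = 16000
t = np.linspace(0, 3.0, 3 * sample_rate, endpoint=False)

# Synthetic "speech": two tone bursts separated by silence.
signal = np.zeros_like(t)
signal[(t > 0.5) & (t < 1.2)] = np.sin(2 * np.pi * 220 * t[(t > 0.5) & (t < 1.2)])
signal[(t > 1.8) & (t < 2.6)] = np.sin(2 * np.pi * 330 * t[(t > 1.8) & (t < 2.6)])

# Short-time energy over 25 ms frames with a 10 ms hop.
frame_len, hop = int(0.025 * sample_rate), int(0.010 * sample_rate)
frames = [signal[i:i + frame_len] for i in range(0, len(signal) - frame_len, hop)]
energy = np.array([np.mean(f ** 2) for f in frames])

# Mark frames whose energy exceeds a fraction of the peak as "speech".
active = energy > 0.1 * energy.max()

# Collapse consecutive active frames into (start, end) times in seconds.
segments, start = [], None
for i, a in enumerate(active):
    if a and start is None:
        start = i
    elif not a and start is not None:
        segments.append((start * hop / sample_rate, i * hop / sample_rate))
        start = None
if start is not None:
    segments.append((start * hop / sample_rate, len(active) * hop / sample_rate))

print("Detected segments (s):", [(round(s, 2), round(e, 2)) for s, e in segments])
```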

Machine touch

(Image: a tactile sensor)

Machine touch is an area of machine perception in which tactile information is processed by a machine or computer. Applications include tactile perception of surface properties and dexterity, whereby tactile information can enable intelligent reflexes and interaction with the environment[13] (for example, by measuring when and where friction occurs, and of what nature and intensity that friction is). Machines, however, still have no way of measuring some physical human experiences we consider ordinary, including physical pain: scientists have yet to invent a mechanical substitute for the nociceptors in the body and brain that are responsible for noticing and measuring physical human discomfort and suffering.
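
The sketch below illustrates, in schematic form, how tactile-array data might be processed: it locates contact on a pressure grid and flags a possible slip when the pressure pattern shifts sharply between frames. The grid size, thresholds, and sensor frames are hypothetical and not tied to any particular device.

```python
# Schematic tactile-array processing: detect where contact occurs on a
# pressure grid and flag a possible slip event when the pattern changes
# quickly between frames. All values below are hypothetical.
import numpy as np

rows, cols = 8, 8          # assumed 8x8 taxel (tactile pixel) grid
contact_threshold = 0.2    # assumed pressure level meaning "in contact"
slip_threshold = 0.5       # assumed frame-to-frame change signalling slip

def process_frame(prev: np.ndarray, curr: np.ndarray) -> dict:
    """Return contact cell count, contact centroid, and a crude slip flag."""
    contact = curr > contact_threshold
    if contact.any():
        ys, xs = np.nonzero(contact)
        weights = curr[contact]
        centroid = (np.average(ys, weights=weights),
                    np.average(xs, weights=weights))
    else:
        centroid = None
    # A large total change in the pressure distribution hints that the
    # grasped object has moved relative to the sensor.
    slip = float(np.abs(curr - prev).sum()) > slip_threshold
    return {"contact_cells": int(contact.sum()), "centroid": centroid, "slip": slip}

# Two synthetic frames: a pressure blob moves one column to the right.
prev = np.zeros((rows, cols)); prev[3:5, 2:4] = 0.8
curr = np.zeros((rows, cols)); curr[3:5, 3:5] = 0.8
print(process_frame(prev, curr))
```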

Machine olfaction

Scientists are also developing machine olfaction: computer systems that can recognize and measure smells. Airborne chemicals are sensed and classified with a device sometimes known as an electronic nose.[14][15]
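
A minimal illustration of the electronic-nose idea, under the assumption of fabricated sensor readings and odour labels: a vector of chemical-sensor responses is treated as a feature vector and matched to known odours with a k-nearest-neighbour classifier from scikit-learn.

```python
# Minimal electronic-nose sketch: classify an odour from a vector of
# chemical-sensor responses. The readings and labels are fabricated;
# only the general pattern (sensor array -> feature vector -> classifier)
# reflects how such systems are commonly described.
import numpy as np
from sklearn.neighbors import KNeighborsClassifier

# Each row: responses of a hypothetical 4-sensor array to a known sample.
X_train = np.array([
    [0.9, 0.1, 0.3, 0.2],   # coffee
    [0.8, 0.2, 0.4, 0.1],   # coffee
    [0.1, 0.9, 0.2, 0.7],   # citrus
    [0.2, 0.8, 0.1, 0.8],   # citrus
])
y_train = ["coffee", "coffee", "citrus", "citrus"]

model = KNeighborsClassifier(n_neighbors=1).fit(X_train, y_train)

# An unknown sample measured by the same array.
unknown = np.array([[0.15, 0.85, 0.15, 0.75]])
print("Predicted odour:", model.predict(unknown)[0])
```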

Machine taste

The electronic tongue is an instrument that measures and compares tastes. According to the IUPAC technical report, an "electronic tongue" is an analytical instrument comprising an array of non-selective chemical sensors with partial specificity to different solution components and an appropriate pattern recognition tool, capable of recognizing the quantitative and qualitative composition of simple and complex solutions.[16][17]

Chemical compounds responsible for taste are detected by human taste receptors. Similarly, the multi-electrode sensors of electronic instruments detect the same dissolved organic and inorganic compounds. Like human receptors, each sensor has a spectrum of reactions different from the others. The information given by each sensor is complementary, and the combination of all of the sensors' results generates a unique fingerprint. Most of the sensors' detection thresholds are similar to or better than those of human receptors.

In the biological mechanism, taste signals are transduced by nerves in the brain into electric signals. The e-tongue's sensors work similarly: they generate electric signals as voltammetric and potentiometric variations.

Taste quality perception and recognition are based on the building or recognition of activated sensory nerve patterns by the brain and the taste fingerprint of the product. This step is achieved by the e-tongue's statistical software, which interprets the sensor data into taste patterns.
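
The following sketch mirrors that pattern-recognition step under simplifying assumptions: synthetic responses from a small sensor array are compressed into a two-dimensional "taste fingerprint" with principal component analysis and matched to reference tastes with a nearest-centroid rule. Real e-tongue software uses many more sensors and more elaborate chemometric models.

```python
# Sketch of the e-tongue pattern-recognition step: combine the responses
# of several partially selective sensors into a low-dimensional
# "fingerprint" and match a new sample against known tastes.
# All numbers are synthetic and purely illustrative.
import numpy as np
from sklearn.decomposition import PCA
from sklearn.neighbors import NearestCentroid

# Rows: samples; columns: voltammetric/potentiometric sensor responses.
X = np.array([
    [1.2, 0.4, 0.9, 0.1, 0.3],   # sweet reference
    [1.1, 0.5, 0.8, 0.2, 0.3],   # sweet reference
    [0.2, 1.3, 0.1, 0.9, 0.8],   # bitter reference
    [0.3, 1.2, 0.2, 1.0, 0.7],   # bitter reference
])
labels = ["sweet", "sweet", "bitter", "bitter"]

# PCA compresses the correlated sensor responses into a 2-D fingerprint.
pca = PCA(n_components=2)
fingerprints = pca.fit_transform(X)

# Nearest-centroid matching assigns a new fingerprint to the closest class.
classifier = NearestCentroid().fit(fingerprints, labels)

new_sample = np.array([[1.0, 0.5, 0.9, 0.15, 0.35]])
new_fp = pca.transform(new_sample)
print("Predicted taste:", classifier.predict(new_fp)[0])
```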

Future

Other than those listed above, some of the future hurdles that the science of machine perception still has to overcome include, but are not limited to:

- Embodied cognition - the theory that cognition is a full-body experience, and therefore can only exist, and only be measured and analyzed, in full if all required human abilities and processes are working together through a mutually aware and supportive network of systems.

- Moravec's paradox - the observation that high-level reasoning requires comparatively little computation, while low-level sensorimotor and perceptual skills require enormous computational resources.

- The principle of similarity - the ability young children develop to determine what family a newly introduced stimulus falls under, even when that stimulus differs from the members with which the child usually associates the family. (For example, a child figuring out that a chihuahua is a dog and a house pet rather than vermin.)

- Unconscious inference - the natural human behavior of determining whether a new stimulus is dangerous, what it is, and then how to relate to it, without ever requiring new conscious effort.

- The innate human ability to follow the likelihood principle in order to learn from circumstances and from others over time.

- Recognition-by-components theory - the ability to mentally analyze even complicated mechanisms and break them into manageable parts with which to interact. For example, a person sees both the cup and the handle that make up a mug full of hot cocoa, and uses the handle to hold the mug so as to avoid being burned.

- The free energy principle - determining, well beforehand, how much energy one can safely delegate to being aware of things outside oneself without losing the energy one requires to sustain one's life and function satisfactorily. This allows one to become optimally aware of the surrounding world without depleting one's energy so much that one experiences damaging stress, decision fatigue, and/or exhaustion.

References

  1. ^ a b c d Malcolm Tatum (October 3, 2012). "What is Machine Perception".
  2. ^ a b c Alexander Serov (January 29, 2013). "Subjective Reality and Strong Artificial Intelligence". arXiv:1301.6359 [cs.AI].
  3. ^ "Machine Perception & Cognitive Robotics Laboratory". www.ccs.fau.edu. Retrieved 2016-06-18.
  4. ^ Cotton, Simon (March 1, 2009). "If it smells - it's chemistry". RSC Education. Retrieved 2022-05-03.
  5. ^ "Artificial networks learn to smell like the brain". MIT News | Massachusetts Institute of Technology. Retrieved 2022-05-03.
  6. ^ "Machine Perception Research - ECE - Virginia Tech". www.ECE.VT.edu. Archived from the original on March 7, 2021. Retrieved January 10, 2018.
  7. ^ a b Dhar, Sagnik; Ordonez, Vicente; Berg, Tamara L. (2011). "High level describable attributes for predicting aesthetics and interestingness" (PDF). CVPR 2011. pp. 1657–1664. doi:10.1109/CVPR.2011.5995467. hdl:1951/55408. ISBN 978-1-4577-0394-2. S2CID 14609200.
  8. ^ a b Tanguiane (Tangian), Andranick (1993). Artificial Perception and Music Recognition. Berlin-Heidelberg: Springer.
  9. ^ Tanguiane (Tangian), Andranick (1994). "Principle of correlativity of perception and its applications to music recognition". Music Perception. 11 (4): 465–502. doi:10.2307/40285634. JSTOR 40285634.
  10. ^ a b Lyon, Richard (2010). "Machine Hearing: An Emerging Field [Exploratory DSP]". IEEE Signal Processing Magazine. 27 (5): 131–139. Bibcode:2010ISPM...27..131L. doi:10.1109/MSP.2010.937498. S2CID 13143070.
  11. ^ Tangian, Andranik (2001). "How do we think: modeling interactions of memory and thinking". Cognitive Processing. 2: 117–151. doi:10.5445/IR/1000133287. S2CID 237995668.
  12. ^ "Machine Perception & Cognitive Robotics Laboratory". ccs.FAU.edu. Retrieved January 10, 2018.
  13. ^ Fleer, S.; Moringen, A.; Klatzky, R. L.; Ritter, H. (2020). "Learning efficient haptic shape exploration with a rigid tactile sensor array". PLOS ONE. 15 (1): e0226880. doi:10.1371/journal.pone.0226880. PMC 6940144. PMID 31896135.
  14. ^ "Using artificial intelligence to smell the roses: Study applies machine learning to olfaction with possible vast applications in flavors and fragrances". ScienceDaily. Retrieved 2022-05-03.
  15. ^ Marr, Bernard. "Artificial Intelligence Is Developing A Sense Of Smell: What Could A Digital Nose Mean In Practice?". Forbes. Retrieved 2022-05-03.
  16. ^ Vlasov, Yu; Legin, A.; Rudnitskaya, A.; Natale, C. Di; D'Amico, A. (2005-01-01). "Nonspecific sensor arrays ("electronic tongue") for chemical analysis of liquids (IUPAC Technical Report)". Pure and Applied Chemistry. 77 (11): 1965–1983. doi:10.1351/pac200577111965. ISSN 0033-4545. S2CID 109659409.
  17. ^ Khalilian, Alireza; Khan, Md. Rajibur Rahaman; Kang, Shin-Won (2017). "Highly sensitive and wide-dynamic-range side-polished fiber-optic taste sensor". Sensors and Actuators B: Chemical. 249: 700–707. doi:10.1016/j.snb.2017.04.088.
  18. ^ Turk, Matthew (2000). "Perceptive Media: Machine Perception and Human Computer Interaction" (PDF). Chinese Journal of Computers. 12: 1235–1244.