‘I want more emotions please’ – Tamil tinseldom relishes this comedy dialogue, delivered by a camera-wielding Mayilsamy to Nai Sekar (Vadivelu), who has just landed in the city!!
Yakshagana is a traditional Indian theatre form, developed in the Dakshina Kannada, Udupi, Uttara Kannada and Shimoga districts and the western parts of Chikmagalur district in the state of Karnataka, and in the Kasaragod district of Kerala, that combines dance, music, dialogue, costume, make-up, and stage techniques with a unique style and form. It is believed to have evolved from pre-classical music and theatre during the period of the Bhakti movement. Yakshagana is traditionally presented from dusk to dawn. Its stories are drawn from the Ramayana, the Mahabharata, the Bhagavata and other epics.
‘Moonru Nimisham Ganesh’ (‘Three Minutes, Ganesh’) was a brilliant thriller written by Sujatha, with a plot in which a bomb is set off during a Yakshagana performance to kill a VIP ~ though it was more about voice recognition than Yakshagana!
Emotions are biological states associated with the nervous system, brought on by neurophysiological changes variously associated with thoughts, feelings, behavioural responses, and a degree of pleasure or displeasure. Emotions are often intertwined with mood, temperament, personality, disposition, creativity, and motivation. The term ‘emotion’ is often used to denote a subjective state displayed via social signals.
Emotion recognition is the process of identifying human emotion. People vary widely in their accuracy at recognizing the emotions of others. Use of technology to help people with emotion recognition is a relatively nascent research area. Generally, the technology works best if it uses multiple modalities in context.
To date, the most work has
been conducted on automating the recognition of facial expressions from video,
spoken expressions from audio, written expressions from text, and physiology as
measured by wearables.
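For the curious, here is a minimal, purely illustrative sketch in Python of the ‘multiple modalities’ idea: separate classifiers score emotions from face, voice and text, and the scores are combined by a simple weighted late fusion. The emotion labels, weights and scores below are hypothetical assumptions for illustration, not the output of any real system.

# Illustrative sketch only: weighted late fusion of emotion scores
# from several modalities (face, voice, text). All numbers are made up.

EMOTIONS = ["anger", "sadness", "happiness", "boredom"]

def fuse(per_modality_scores, weights):
    """Combine per-modality emotion scores into one weighted-average score per emotion."""
    fused = {e: 0.0 for e in EMOTIONS}
    total_weight = sum(weights[m] for m in per_modality_scores)
    for modality, scores in per_modality_scores.items():
        for emotion in EMOTIONS:
            fused[emotion] += weights[modality] * scores.get(emotion, 0.0) / total_weight
    # Return the most likely emotion plus the full fused distribution.
    return max(fused, key=fused.get), fused

if __name__ == "__main__":
    # Hypothetical outputs of three separate classifiers for one moment in time.
    scores = {
        "face":  {"anger": 0.10, "sadness": 0.20, "happiness": 0.60, "boredom": 0.10},
        "voice": {"anger": 0.05, "sadness": 0.15, "happiness": 0.70, "boredom": 0.10},
        "text":  {"anger": 0.20, "sadness": 0.30, "happiness": 0.40, "boredom": 0.10},
    }
    weights = {"face": 0.5, "voice": 0.3, "text": 0.2}  # assumed modality reliabilities
    label, fused = fuse(scores, weights)
    print(label, fused)

The point of the sketch is simply that each modality on its own is noisy, and combining them is what the ‘works best in context’ claim rests on.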
Miles away, we are not hearing anything about Covid from China these days. Over there, Xi Jinping wants ‘positive energy’, but critics say the surveillance tools’ racial bias and their monitoring for anger or sadness mean they should be banned. Technology in China can now supposedly detect one’s state of mind. “Ordinary people in China aren’t happy about this technology, but they have no choice. If the police say there have to be cameras in a community, people will just have to live with it. There’s always that demand and we’re here to fulfil it.”
So says Chen Wei at
Taigusys, a company specialising in emotion recognition technology, the latest
evolution in the broader world of surveillance systems that play a part in
nearly every aspect of Chinese society. Emotion-recognition technologies – in which facial expressions of anger, sadness, happiness and boredom, as well as other biometric data, are tracked – are supposedly able to infer a person’s feelings based on traits such as facial muscle movements, vocal tone, body movements and other biometric signals.
It goes beyond
facial-recognition technologies, which simply compare faces to determine a
match. But similar to facial recognition, it involves the mass collection of
sensitive personal data to track, monitor and profile people and uses machine
learning to analyse expressions and other clues.
The industry is booming in
China, where since at least 2012, figures including President Xi Jinping have
emphasised the creation of “positive energy” as part of an ideological campaign
to encourage certain kinds of expression and limit others. Critics say the
technology is based on a pseudo-science of stereotypes, and an increasing
number of researchers, lawyers and rights activists believe it has serious
implications for human rights, privacy and freedom of expression. With the
global industry forecast to be worth nearly $36bn by 2023, growing at nearly
30% a year, rights groups say action needs to be taken now.
The main office of Taigusys is tucked behind a few low-rise office buildings in Shenzhen. Visitors are greeted at the doorway by a series of cameras capturing their images on a big screen that displays body temperature, along with age estimates and other statistics. Chen, a general manager at the company, says the system in the doorway is the company’s bestseller at the moment because of high demand during the coronavirus pandemic. Chen hails emotion recognition as a way to predict dangerous behaviour by prisoners, detect potential criminals at police checkpoints, identify problem pupils in schools and spot elderly people experiencing dementia in care homes.
“Violence and suicide are very common in detention centres,” says Chen. “Even if police nowadays don’t beat prisoners, they often try to wear them down by not allowing them to fall asleep. As a result, some prisoners will have a mental breakdown and seek to kill themselves. And our system will help prevent that from happening.” Chen says that since prisoners know they are monitored by this system – 24 hours a day, in real time – they are made more docile, which for authorities is a positive on many fronts. “Because they know what the system does, they won’t consciously try to violate certain rules,” he says.
Another problem is that
recognition systems are usually based on actors posing in what they think are
happy, sad, angry and other emotional states and not on real expressions of those
emotions. Facial expressions can also vary widely across cultures, leading to
further inaccuracies and ethnic bias.
With regards – S. Sampathkumar
1.5.2021