‘Child-friendly’ Tone e20 smartphone that blocks users from taking naked selfies, doesn’t save ‘inappropriate’ images and lets parents check up on their kids is launched in Japan
- The Tone Mobile e20 is specifically designed for children and teenagers to use
- Its ability to block images can be turned on or off by parents using a passcode
- When enabled it automatically detects naked flesh and turns off the camera
- It sends a pixelated thumbnail of the blocked image to parents with the time it was taken
A new mobile phone that has been branded as ‘child friendly’ is able to block users from taking naked selfies and doesn’t save ‘inappropriate’ images to the phone.
Developed by Japanese smartphone company Tone, the e20 uses artificial intelligence technology to detect potentially x-rated images and block them.
The device, only available in Japan, can also be set up to send an alert with a pixelated image to a parent if their child tries to take a picture of their naked body.
The phone has been specifically designed to target a younger audience, with a range of parent-friendly protective features and a lower price tag of about £180.
The Tone Mobile smartphone was designed with children in mind and has a range of features designed to protect younger users – including a camera blocker if it detects naked flesh
The company says it isn’t designed to stop adults sending pictures of themselves to one another, but to protect younger users.
Its camera app software is similar to technology used by social media companies to alert authorities if someone shares naked images of children.
Despite being aimed at a younger audience, it still has a large 6.26-inch display, Android 9.0 and a triple-lens camera similar to those found on more expensive devices.
Tone says its goal was to develop a device that provided increased security for minors who may be targeted by online scammers and paedophiles.
The company says the aim was to protect children from coercion tactics in which they are tricked or threatened into sending nude pictures of themselves.
Children will be able to simply say ‘my phone won’t allow me to take that picture’.
Even though it won’t save an inappropriate image, the phone can be set up to send a pixelated thumbnail to the parent along with the date, time and GPS information on where it was taken – every time the child attempts to capture a naked selfie.
Tone Mobile says it does not retain any of the pixelated thumbnails or images captured by the camera on any device or server beyond the parent’s phone.
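Tone has not published technical details of how the camera block works, but the flow it describes – classify the frame on the device, refuse to save it, and send only a pixelated thumbnail with the time and location to the paired parent phone – could conceptually look like the sketch below. This is an illustrative Python sketch, not Tone’s code: nudity_score and notify_parent are hypothetical placeholders, and only the Pillow pixelation step uses a real library.

```python
# Illustrative sketch only - Tone has not published its implementation.
from datetime import datetime
from PIL import Image

NUDITY_THRESHOLD = 0.8  # assumed cut-off, not a published figure


def nudity_score(frame: Image.Image) -> float:
    """Placeholder for the on-device AI model that rates naked flesh from 0 to 1."""
    raise NotImplementedError


def notify_parent(alert: dict) -> None:
    """Placeholder for the push channel to the paired parent phone."""
    raise NotImplementedError


def pixelate(frame: Image.Image, factor: int = 16) -> Image.Image:
    """Shrink then re-enlarge with nearest-neighbour resampling to make a blocky thumbnail."""
    w, h = frame.size
    small = frame.resize((max(1, w // factor), max(1, h // factor)), Image.BILINEAR)
    return small.resize((w, h), Image.NEAREST)


def on_shutter_pressed(frame: Image.Image, gps_fix: tuple[float, float], protection_on: bool) -> bool:
    """Return True if the photo may be saved, False if it was blocked and reported."""
    if protection_on and nudity_score(frame) >= NUDITY_THRESHOLD:
        notify_parent({
            "thumbnail": pixelate(frame),             # pixelated preview only
            "timestamp": datetime.now().isoformat(),  # date and time of the attempt
            "location": gps_fix,                      # GPS fix at the moment of capture
        })
        return False  # the full-resolution image is never written to storage
    return True
```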
There are a range of security features available to parents using the Smartphone Protection feature – which is already available without the nudity restriction on older phones made by the Japanese company.
They include the ability to track where your child is, see what websites they visit and get warnings about changes in behaviour.
The device can also be used in schools, as it can block certain apps depending on the GPS location of the user – for instance, it can block games and social media while the child is at school.
The child protection feature in the camera software automatically detects naked flesh, blocks the image from being taken and can send a pixelated thumbnail to parents with a location
It uses artificial intelligence to monitor the apps, websites and even real world walking and transport routes a child takes.
If anything significant changes in their behaviour it can warn parents – or if the child doesn’t catch the bus they take to school every day, it can alert the emergency services.
It is also ‘school friendly’, as certain apps can be locked out depending on location – so if the child is at school it could automatically block social media apps and only allow those used for learning or talking to parents.
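Tone has not said how the location check is implemented; one common way to decide ‘at school’ is a simple GPS geofence, as in this hypothetical sketch (the coordinates, radius and app categories are invented examples, not values used by the e20).

```python
# Hypothetical geofence sketch - not Tone's implementation.
import math

SCHOOL_CENTRE = (35.6812, 139.7671)   # example latitude/longitude, not a real school
SCHOOL_RADIUS_M = 300.0               # assumed geofence radius in metres
BLOCKED_AT_SCHOOL = {"games", "social_media"}
ALWAYS_ALLOWED = {"learning", "calls_to_parents"}


def distance_m(a: tuple[float, float], b: tuple[float, float]) -> float:
    """Great-circle distance between two latitude/longitude points (haversine formula)."""
    lat1, lon1, lat2, lon2 = map(math.radians, (*a, *b))
    h = (math.sin((lat2 - lat1) / 2) ** 2
         + math.cos(lat1) * math.cos(lat2) * math.sin((lon2 - lon1) / 2) ** 2)
    return 2 * 6_371_000 * math.asin(math.sqrt(h))


def app_allowed(app_category: str, gps_fix: tuple[float, float]) -> bool:
    """Block configured app categories while the phone is inside the school geofence."""
    if app_category in ALWAYS_ALLOWED:
        return True
    at_school = distance_m(gps_fix, SCHOOL_CENTRE) <= SCHOOL_RADIUS_M
    return not (at_school and app_category in BLOCKED_AT_SCHOOL)
```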
The feature is called Smartphone Protection and is under the control of an adult user who can decide to turn it on or off using a special passcode.
If the phone isn’t for a child and is instead being used by an adult, the feature can be disabled – including the naked picture blocking – which would allow someone to take photos of themselves in as few or as many clothes as they like.
WHAT DO FACEBOOK’S GUIDELINES FOR CONTENT SAY?
Facebook has disclosed its rules and guidelines for deciding what its 2.2 billion users can post on the social network.
The full guidelines can be read here. Below is a summary of what they say:
1. Credible violence
Facebook says it considers the language, context and details in order to distinguish casual statements from content that constitutes a credible threat to public or personal safety.
2. Dangerous individuals and organisations
Facebook does not allow any organizations or individuals that are engaged in terrorist activity, organized hate, mass or serial murder, human trafficking, or organized violence or criminal activity.
3. Promoting or publicising crime
Facebook says it prohibits people from promoting or publicizing violent crime, theft and/or fraud. It does not allow people to depict criminal activity or admit to crimes they or their associates have committed.
4. Coordinating harm
The social network says people can draw attention to harmful activity that they may witness or experience as long as they do not advocate for or coordinate harm.
5. Regulated goods
The site prohibits attempts to purchase, sell or trade non-medical drugs, pharmaceutical drugs and marijuana, as well as firearms.
6. Suicide and self-injury
The rules for ‘credible violence’ apply for suicide and self-injury.
7. Child nudity and sexual exploitation of children
Facebook does not allow content that sexually exploits or endangers children. When it becomes aware of apparent child exploitation, it reports it to the National Center for Missing and Exploited Children (NCMEC).
8. Sexual exploitation of adults
The site removes images that depict incidents of sexual violence and intimate images shared without permission from the people pictured.
9. Bullying
Facebook removes content that purposefully targets private individuals with the intention of degrading or shaming them.
10. Harassment
Facebook’s harassment policy applies to both public and private individuals.
It says that context and intent matter, and that the site will allow people to share and re-share posts if it is clear that something was shared in order to condemn or draw attention to harassment.
11. Privacy breaches and image privacy rights
Users should not post personal or confidential information about others without first getting their consent, says Facebook.
12. Hate speech
Facebook does not allow hate speech because it says it creates an environment of intimidation and exclusion, and in some cases may promote real-world violence.
13. Graphic violence
Facebook will remove content that glorifies violence or celebrates the suffering or humiliation of others.
It will, however, allow graphic content (with some limitations) to help people raise awareness about issues.
14. Adult nudity and sexual activity
The site restricts the display of nudity or sexual activity.
It will also default to removing sexual imagery to prevent the sharing of non-consensual or underage content.
15. Cruel and insensitive
Facebook says it has higher expectations for content that it defines as cruel and insensitive.
It defines this as content that targets victims of serious physical or emotional harm.
16. Spam
Facebook is trying to prevent false advertising, fraud and security breaches.
It does not allow people to use misleading or inaccurate information to artificially collect likes, followers or shares.
17. Misrepresentation
Facebook will require people to connect on Facebook using the name that they go by in everyday life.
18. False news
Facebook says that there is also a fine line between false news and satire or opinion.
For these reasons, it won’t remove false news from Facebook, but will instead significantly reduce its distribution by showing it lower in the News Feed.
19. Memorialisation
Facebook will memorialise accounts of people who have died by adding “Remembering” above the name on the person’s profile.
The site will not remove, update or change anything about the profile or the account.
20. Intellectual property
Facebook users own all of the content and information that they post on Facebook, and have control over how it is shared through their privacy and application settings.
21. User requests
Facebook says it will comply with:
- User requests for removal of their own account
- Requests for removal of a deceased user’s account from a verified immediate family member or executor
- Requests for removal of an incapacitated user’s account from an authorised representative
22. Additional protection of minors
Facebook complies with:
- User requests for removal of an underage account
- Government requests for removal of child abuse imagery depicting, for example:
- Beating by an adult
- Strangling or suffocating by an adult
- Legal guardian requests for removal of attacks on unintentionally famous minors