Samsung has patented a new smartphone design featuring a fully bezel-less screen that curves slightly on all four sides. The company proposes integrating the front camera and sensors under the screen, in the upper area of the display.
It’s clear that one no longer needs to go on the dark web to solicit and share child sexual abuse material (CSAM). Last year, Facebook said it removed 8.7 million sexually exploitative images of children in three months. In May this year, Twitter announced that it had suspended 4,58,989 accounts for violations related to child sexual exploitation on its platform. And six months ago, WhatsApp said it had removed 1,30,000 accounts in 10 days for distributing child sexual abuse material, often referred to as child pornography.
The dark web (the part of the World Wide Web that is not indexed, and therefore cannot be accessed using usual browsers) is no longer a prerequisite for finding such material. In fact, India is one of the biggest contributors and consumers of CSAM, even though it is completely illegal.
Creating (real or simulated) as well as storing CSAM for commercial purposes is illegal under the Protection of Children from Sexual Offences (POCSO) Act (Sections 13-15). Further, Section 67B of the Information Technology Act bars publishing as well as sharing material depicting a child in a sexually explicit act in electronic form. Browsing, downloading, advertising, promoting, exchanging, and distributing such material in any form is also prohibited under the Act. The maximum punishment for these offences is imprisonment of seven years.
Read more: https://www.thenewsminute.com/article/little-awareness-lax-policies-why-child-sex-abuse-photos-videos-persist-online-103699