#electronic Instagram Photos & Videos

electronic - 4318568 posts

Latest Instagram Posts

  • mediacon - MediaCon @mediacon 18 minutes ago
  • It’s clear that one no longer needs to go on the dark web to solicit and share child sexual abuse material (CSAM). Last year, #Facebook said it removed 8.7 million sexually exploitative #images of children in three months. In May this year, #Twitter announced that it had suspended 4,58,989 accounts for violations related to child sexual exploitation on its platform. And six months ago, #WhatsApp said it had removed 1,30,000 accounts in 10 days for distributing child sexual abuse material, often referred to as #childpornography.

It’s clear that one no longer needs to go on the #darkweb (the part of the World Wide Web that is not indexed and therefore cannot be accessed using usual #browsers) to solicit and share child sexual abuse material (CSAM). In fact, #India is one of the biggest contributors to and #consumers of CSAM, even though it is completely illegal.

Creating (whether real or simulated) as well as storing #CSAM for commercial purposes is illegal under the #Protection of Children from #Sexual #Offences (#POCSO) Act (sections 13-15). Further, Section 67B of the Information #Technology Act bars publishing as well as sharing material depicting a #child in a sexually #explicit act in #electronic form. Browsing, downloading, #advertising, promoting, #exchanging and distributing such material in any form is also prohibited under the #Act. The maximum #punishment for these offences is #imprisonment of seven years.

Read more: https://www.thenewsminute.com/article/little-awareness-lax-policies-why-child-sex-abuse-photos-videos-persist-online-103699

#childsexualabuse