Instagram Releases Update to Its Account Disabling Policy
January 02, 2018
Instagram recently announced a policy update to its account disabling mechanism. The policy was developed together with Facebook to ensure that Instagram remains a supportive platform for everyone, and it also helps Instagram detect and remove accounts that repeatedly violate its policies. Previously, Instagram would disable an account only once the posts violating its policies reached a certain percentage of that account's content.
Now, in addition to removing accounts whose violating posts exceed that percentage threshold, Instagram's new policy will also remove accounts that commit a certain number of violations within a set period of time.
As with the account disabling mechanism already implemented on Facebook, this update will help Instagram enforce its account disabling policies more consistently and encourage every user to take responsibility for the content they create.
In addition, Instagram is introducing a new notification process that makes users more aware when their account is at risk of being disabled. Through this notification, Instagram gives users the opportunity to appeal the removal of their content.
Initially, the opportunity to appeal applies to content removed for nudity and pornography, bullying and harassment, hate speech, drug sales, and terrorism. Instagram plans to expand the appeal option over the next few months.
If a post was removed in error, Instagram will restore the post and delete the violation from the account's record. Users can also appeal a disabled account at any time through the Instagram Help Center.
In the coming months, Instagram will also add the ability to appeal directly from the app. The update itself was launched on July 19, 2019 as an important step toward keeping Instagram a safe social network for every user.