Instagram has announced changes to its policy related to disabling user accounts, stating that going forward it will remove a greater number of accounts, but will first give users a warning that they’re at risk of the action.

Until now, Instagram’s policy involved the company disabling accounts that contained ‘a certain percentage of violating content.’ Though this policy will remain in place, Instagram says it will also start disabling accounts that contain ‘a certain percentage of violating content within a window of time.’

Additionally, Instagram will start alerting users when their account is at risk of being disabled. The notification includes the content that Instagram removed for violating its guidelines, as well as a list of past violations and a warning that one more removed post may result in the account being disabled.

The same notification will offer users a way to appeal the decision, though appeals will initially be limited to violations involving bullying and harassment, hate speech, nudity and pornography, counter-terrorism policies, and drug sales. Instagram says it will expand the appeals feature to cover other issues in the coming months.

The change follows Facebook’s April meeting, during which it revealed that Instagram content considered ‘inappropriate’ will be demoted on the platform. This demotion applies to content that doesn’t violate Instagram’s Community Guidelines, but that ‘might not be appropriate for our global community,’ the company said at the time.

The content demotion policy has proven controversial with users, primarily because of the ambiguity of what is considered ‘appropriate’ for inclusion on hashtag pages and in Explore. Based on Instagram’s guidelines, it doesn’t appear that this ‘inappropriate’ content will be factored into the strikes that may get an account disabled.