Social media giant Facebook is making plans to improve members' security and to improve its current reporting system. Facebook was developed in 2004 by Mark Zuckerberg while he was still attending college. Since its launch, the platform has grown to an estimated 1.94 billion monthly users, and that number continues to grow. With that growth, security and safety issues among users have become a mounting concern.

In recent years there have been reports of crimes and abuses on the platform, including murders, growing hate, religious extremism, suicides, cyberbullying, harassment, and trolling.

In the wake of all this, Facebook has decided to hire an additional 3,000 people to monitor Facebook's live streams, bringing the number of such employees to 7,500. The hope is that monitors will be able to stop a live feed during an act of violence.

Security and safety

In a statement regarding Facebook's reporting system, Mark Zuckerberg said, "We are building better tools to keep the community safe." He continued, "We're making it easier to report problems and faster for reviewers to determine which posts violate Facebook standards." The reporting system also includes law enforcement options if members feel that they or someone else could be in danger. The company is also launching artificial intelligence bots into groups.

The hope is that these bots can do many things, including flagging problematic members.

Facebook has set community guidelines of which all members are made aware. Members are required to use real names, and the guidelines prohibit harassment, bullying, pornography, and hate speech, and set age limits for users. Facebook also offers options such as blocking other members.

Even with the implementation of these improvements, many members feel it is not enough. They also believe that Facebook's reporting system is biased and shows favoritism toward certain members, groups, and pages across the platform. Members throughout Facebook have complained that they have been suspended over minor issues while other members who have broken rules and said horrible things to others get ignored.

Facebook does have a responsibility to allow members free speech. Members might not always like what other people say or the opinions they hold, but those people have a right to express themselves, and many users feel that Facebook itself has violated that right. There is an invisible line, however: when discussions turn into hate speech, bullying, and harassment, many users feel more should and can be done about it.


Many users shared their views on the improvements being made by the social media giant. Those who spoke all expressed similar feelings and concerns.

Mark McNally, a daily user and Facebook member since 2006, said, "I used to regularly report some of the nastiest vile racism and bigotry and was also told they hadn't broken any community standards, but I got a 24-hour ban for calling a woman stupid."

Other Facebook users have made similar complaints. Bella Donna Marie, another long-time Facebook member, commented that she has noticed things going on. "I can tell that if you have any free thought as a conservative or modern patriot they ban your video, ban you, and remove pages," she said.

Peter Hugo also said, "How can FB improve on reporting when they allow hate speech toward Israel, but as soon as you say something about them you get banned?"

Zakir Khan Tareen talked about his concerns regarding Facebook, saying, "FB has its pros and cons. FB is a great source for sharing information and allowing people from all over the world who might not otherwise have a voice to have one." Tareen also believes that intensifying religious activity has brought out fanatics and bigots more easily.


Administrators and moderators in some groups on Facebook have also played a hand in the mistreatment of members. They do nothing about members posting things that go against community standards and in some cases participate in the harassment, both inside and outside the group's environment. They also behave as if they are above community standards and can set their own rules.

In these groups, administrators and moderators feel they have a right to tell members to unblock others and not to block administrators. Administrator positions can open up at any time, and group leaders ask certain members if they want the position; some even ask members who are known trolls to be in charge. Members are sometimes removed from a group even when they have done nothing wrong.

The people running these groups take no responsibility for keeping members safe and happy. They do not care about the reasoning behind complaints and think they can supersede community guidelines. Facebook does provide a safeguard that lets administrators and moderators see posts from members who have blocked them, as long as those members post in the group. To be fair, some groups set rules so that they run more efficiently and effectively, and some administrators are unbiased and fair with members.

Mark Zuckerberg last week changed Facebook's mission statement to "give people the power to build community and bring the world closer together." Zuckerberg wants to give groups more support and believes that it is possible for groups to change the world.