Controlled, monitored, and audited by whom? I believe y’all and appreciate the right approach here, but I don’t know any of you, and it sounds like a setup for “we investigated ourselves and found no wrongdoing”.
@ucsenoi The owner of this instance is @jerry and you can find a link to the moderation staff within the Wiki here: https://wiki.infosec.exchange/about/moderators

I guess what I meant to ask isn’t exactly who; my point is that having the same people who perform the access also audit it for abuse isn’t an inherently trustworthy setup. Either there is a separate team with its own incentives and the power to hold those with access accountable, or the public must take that role, which means there needs to be enough transparency to empower people to do it.

Anyway, there’s no easy solution here, especially when it’s all volunteers. I appreciate the work you’re all putting into having responsible moderation here.

@ucsenoi I did try to work on this a bit, using the Mastodon.py library against the API to build a "transparency" report, only to discover there was no way to do it without basically hammering the API and running into rate limiting (I have put the project's git repo on my feed). We're definitely open to suggestions/solutions to resolve some of the many concerns users have (including us).
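For anyone wanting to poke at the same problem: the usual way to survive rate limiting when crawling an API is to back off and retry instead of hammering it. This is just a minimal stdlib sketch of that pattern, not Mastodon.py's actual mechanism; `RateLimitError` and `call_with_backoff` are hypothetical names for illustration.

```python
import time

class RateLimitError(Exception):
    """Hypothetical error an API client might raise on an HTTP 429 response."""
    def __init__(self, retry_after=1.0):
        self.retry_after = retry_after

def call_with_backoff(fn, *args, max_retries=5, base_delay=1.0, sleep=time.sleep, **kwargs):
    """Call fn; on a rate-limit error, sleep with exponential backoff and retry."""
    for attempt in range(max_retries):
        try:
            return fn(*args, **kwargs)
        except RateLimitError as e:
            # Honor the server's Retry-After hint if it exceeds our own backoff.
            delay = max(e.retry_after, base_delay * (2 ** attempt))
            sleep(delay)
    raise RuntimeError(f"gave up after {max_retries} rate-limited attempts")
```

(Mastodon.py itself can also pace requests for you via its `ratelimit_method` option, so for a report script that may be the simpler route.)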

@apiratemoo nice!

Something I always appreciated at $work is that when an employee accesses your account (internal account, on-platform account, whatever) you get either a message asking you to approve/deny the action, or at least a notification. As a Red Teamer this has caught me a couple of times when I thought I was being sneaky, only to trigger that and have the target escalate the event to blue team.

I'd like to see that implemented here (inb4 "send a PR"). It makes auditors think a little harder before accessing user data, and gives users both peace of mind and recourse.

It'll probably be bypassable with direct DB access, which isn't great, but hopefully the number of people with that access is smaller than the number of moderators. Harm reduction.

@ucsenoi All of it is a conversation we need to have, and we're aware of that, but we're also juggling a ton of things on a new platform (I know the Moderator team has been working pretty hard to get as much out as possible). I'm going to make a note that we should probably add some sort of feedback forum and specific hashtags to help engage with everyone on here (this is as much your instance as it is everyone's). More stuff to add to the to-do list.