
MediaWiki talk:Spam-blacklist

Revision as of 10:14, 11 June 2009 by MichelSableheart (talk | contribs) (spam removal)

Any user can suggest URLs that they see in repeated spam here. SysOps should comment on entries that have been added to the list or declined, to avoid duplication of effort.

Whitelist Suggestions

This can include personal webpages of actual Scummers on troublesome domains, or pages whose URLs happen to contain a word that is often used in spam.

hajl.cheefmsn.com

a subdomain of cheefmsn? It keeps coming back no matter what
   – Andycyca (talk | contribs) 23:36, 8 October 2007


The problem with this one is that none of the links on the spammed pages actually go to cheefmsn.com, which I assume is why the filter isn't catching them.
somestrangeflea 00:36, 9 October 2007 (MDT)
Ahah, good catch.
-- Mr. Flay 14:20, 13 October 2007 (MDT)

alphabetical

should we alphabetize the list?.. just seems like it would be easier to not get duplicates if it was organized somehow
   – LyingBrian (talk | contribs) 13:22, 18 October 2007 (MDT)


ok, i went ahead &amp; did this... if there's a reason why it shouldn't be alphabetized, feel free to revert it... also adding 'casino' to the list...
   – LyingBrian (talk | contribs) 13:29, 18 October 2007 (MDT)


Nope, order doesn't matter, and that seems like a reasonable way of keeping the list easy to manage...
-- Mr. Flay 13:59, 18 October 2007 (MDT)

funny thing

Hi! A few minutes ago I was updating my userpage, and when I hit "Save" a warning page came up saying that the Spam filter had blocked my page. The reason? Apparently the link to my deviantart account. screenshot The same happened with my blog link (which is vacuidaddeluz on blogspot). Maybe the filter's looking for URLs with "de". It's not a big deal to me, as I'm not whoring for traffic to my blog, but this means the filter's possibly too good.
Andycyca||me||What? 10:11, 30 October 2007 (MDT)

yeah, i'm assuming it was the same thing... i removed the '.de' entry from the list, so try it again now, and it should let you post it... Flay, do you think the first few TLDs, i.e. .ru - .to, are a little too broad?..
   – Dani Banani (talk | contribs) 12:52, 30 October 2007 (MDT)


Thanks a lot, I'll try it now. Maybe if ".de" or similar are too common my links should be whitelisted or something. time will tell if .de is potential spam...
Andycyca||me||What? 00:05, 31 October 2007 (MDT)
Yep, that's exactly what the whitelist is for. We can try whitelisting deviantart.com for now and see if that helps...
-- Mr. Flay 00:07, 31 October 2007 (MDT)
i guess i don't understand how the blacklist works... i understand why .deviantart.com was blocked, but i don't understand why http://vacuidaddeluz.blogspot.com was blocked... it's not just blocking addresses that exactly match .de, it blocks anything with de in the URL unless the de comes first, with nothing before it... for example, it doesn't block http://del.icio.us.com, but it would block http://bdelicious.com even though the de isn't immediately preceded by a period... so with this additional information, i can't help but repeat my concerns that the first few items on the blacklist are too broad... it may not be that big of a problem right now, as we can add exceptions to the whitelist, but i foresee it becoming a bigger problem, so instead of continuing to increase the number of exceptions to the rule, i think we should make the items on the blacklist more specific, especially the first few...
   – Dani Banani (talk | contribs) 01:58, 2 November 2007 (MDT)


No, you're right, there was a problem with using the . in the list, it just took me a while to remember why. We need to escape all of the special characters in the list with a \ to make them read like we want to; I forgot that . itself was a special character.
-- Mr. Flay 11:32, 4 November 2007 (MST)
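To illustrate the escaping issue: in a regular expression, an unescaped . matches any single character, so a blacklist entry like .de matches any "de" with one character in front of it, anywhere in the URL. A quick sketch in Python (just to test the patterns themselves; the wiki's filter is PHP, and this ignores however the extension wraps the entries):

```python
import re

# Unescaped, "." matches ANY single character, so ".de" hits any
# "de" with one character before it -- including the "dde" in the
# middle of the hostname vacuidaddeluz.
loose = re.compile(r'.de')
print(bool(loose.search('vacuidaddeluz.blogspot.com')))   # True (matches "dde")

# Escaped, "\.de" matches only a literal dot followed by "de",
# i.e. an actual .de domain label.
strict = re.compile(r'\.de')
print(bool(strict.search('vacuidaddeluz.blogspot.com')))  # False
print(bool(strict.search('http://example.de/page')))      # True
```

So escaping the dot, as described above, keeps the .de entry from swallowing innocent hostnames.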

regular expressions

ok, so i'm trying to come up w/ a regex to block any instance of the word regardless of the character that comes before it... using sample as an example, the closest thing i've found is .+sample... this blocks any character before sample except for [ ] " < > and a single or double slash (/ or //), but it would block a triple slash... of course just adding sample to the blacklist would block http://sample.com whereas .+sample would not, but http://example-sample.com would not be blocked by sample, but would be blocked by .+sample. if anyone knows how to write a regex that would block both, please advise, as the second example (http://example-sample.com) is what i'm seeing more frequently in the spam pages... of course one option is to add both instances of each word, one that is plain (sample), and one that is nearly wild (.+sample).
   – Dani Banani (talk | contribs) 15:32, 5 November 2007 (MST)
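One possible answer to the question above: in regex, + means one-or-more while * means zero-or-more, so .*sample should catch both URLs. A rough sketch in Python, assuming (based on the behavior described in this thread, not the extension's actual source) that the filter anchors each entry just after the protocol:

```python
import re

def blocked(entry, url):
    # Rough model of the filter: anchor each blacklist entry
    # right after the protocol. This wrapper is an assumption
    # inferred from the behavior described above.
    return re.match(r'https?://(?:%s)' % entry, url) is not None

# ".+" requires at least one character before "sample",
# so plain sample.com slips through:
print(blocked(r'.+sample', 'http://sample.com'))          # False
print(blocked(r'.+sample', 'http://example-sample.com'))  # True

# ".*" allows zero characters before "sample", catching both:
print(blocked(r'.*sample', 'http://sample.com'))          # True
print(blocked(r'.*sample', 'http://example-sample.com'))  # True
```

If that model is right, a single .*sample entry would cover both cases without doubling up every word on the list.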


del.icio.us

i really feel that del.icio.us is too broad... some users may want to include a link on their userpage, and i don't think it should be necessary for users to request their specific URL to be whitelisted... i'm not even sure why it was blacklisted in the first place... i'll give this a few days for others to respond, but then i think i will remove it...
   – Dani Banani (talk | contribs) 07:46, 19 November 2007 (UTC)


Removed it from the list for now. Let's see if it becomes a problem again... -- Mr. Flay 10:21, 24 December 2007 (MST)