
MediaWiki talk:Spam-blacklist: Difference between revisions

== funny thing ==
 
Hi! Some minutes ago I was updating my userpage, and when I hit "Save" a warning page came up saying that the Spam filter had blocked my edit. The reason? Apparently the link to my deviantart account. [http://img456.imageshack.us/img456/9861/blockedmepw9.png screenshot] The same happened with my blog link (which is vacuidaddeluz on blogspot). Maybe the filter's looking for URLs with "de" in them. It's not a big deal to me, as I'm not whoring for traffic to my blog, but it does mean the filter's possibly too good. <br>
<span style="color:#336600">Andycyca</span>||[[User:Andycyca|<span style="color:#006633">me</span>]]||[[User_talk:Andycyca|<span style="color:green">What?</span>]] 10:11, 30 October 2007 (MDT)
 
:yeah, i'm assuming it was the same thing... i removed the '.de' entry from the list, so try it again now, and it should let you post it... Flay, do you think the first few entries, i.e. the TLDs .ru - .to, are a little too broad?..{{sig|Dani Banani|12:52, 10.30.2007 (MDT)}}
 
::Thanks a lot, I'll try it now. Maybe if ".de" or similar entries are too common, my links should be whitelisted or something. Time will tell if .de is actually a source of spam...
::<span style="color:#336600">Andycyca</span>||[[User:Andycyca|<span style="color:#006633">me</span>]]||[[User_talk:Andycyca|<span style="color:green">What?</span>]] 00:05, 31 October 2007 (MDT)
 
:Yep, that's exactly what the whitelist is for. We can try whitelisting deviantart.com for now and see if that helps...
:-- [[User:Mr. Flay|Mr.]] [[User_talk:Mr. Flay|Flay]] 00:07, 31 October 2007 (MDT)
 
::i guess i don't understand how the blacklist works... i understand why <code>.deviantart.com</code> was blocked, but i don't understand why <code><nowiki>http://vacuidaddeluz.blogspot.com</nowiki></code> was blocked... it's not just blocking addresses that contain a literal <code>.de</code>; it blocks anything with <code>de</code> in the URL as long as at least one character comes before the <code>de</code>... for example, it doesn't block <code><nowiki>http://del.icio.us.com</nowiki></code>, but it would block <code><nowiki>http://bdelicious.com</nowiki></code> even though the <code>de</code> there isn't immediately preceded by a period... so with this additional information, i can't help but repeat my concern that the first few items on the blacklist are too broad... it may not be that big of a problem right now, as we can add exceptions to the whitelist, but i foresee it becoming a bigger problem, so instead of continuing to increase the number of exceptions to the rule, i think we should make the items on the blacklist more specific, especially the first few...{{sig|Dani Banani|01:58, 2 November 2007 (MDT)}}
 
:::No, you're right, there was a problem with using the . in the list, it just took me a while to remember why. We need to escape all of the special characters in the list with a \ to make them read like we want to; I forgot that . itself was a special character.
:::-- [[User:Mr. Flay|Mr.]] [[User_talk:Mr. Flay|Flay]] 11:32, 4 November 2007 (MST)
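To make the escaping point above concrete, here is a small Python sketch (Python's <code>re</code> engine rather than the extension's PHP one, but the metacharacter behaviour at issue is the same): an unescaped <code>.</code> matches any single character, which is exactly why the examples in this thread were caught.

```python
import re

# An unescaped '.' is a regex metacharacter matching ANY single character,
# so a blacklist line like '.de' catches every URL that contains 'de'
# with at least one character in front of it:
print(bool(re.search(r'.de', 'http://vacuidaddeluz.blogspot.com')))  # True  ('dde')
print(bool(re.search(r'.de', 'http://bdelicious.com')))              # True  ('bde')

# Escaped as '\.de', it only matches a literal dot followed by 'de',
# i.e. an actual .de domain suffix:
print(bool(re.search(r'\.de', 'http://vacuidaddeluz.blogspot.com'))) # False
print(bool(re.search(r'\.de', 'http://www.deviantart.com')))         # True
```

This is why every literal <code>.</code> in the list needs a <code>\</code> in front of it, as noted above.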



Revision as of 00:31, 13 June 2009

Any user can suggest URLs that they see in repeated spam here. SysOps should comment on entries that have been added to the list or declined, to avoid duplication of effort.


== regular expressions ==

ok, so i'm trying to come up with a regex to block any instance of a word regardless of the character that comes before it... using <code>sample</code> as an example, the closest thing i've found is <code>.+sample</code>... this blocks any character before <code>sample</code> except for <code>[ ] " < ></code> and a single or double slash (<code>/</code> or <code>//</code>), but it would block a triple slash... of course, just adding <code>sample</code> to the blacklist would block <code><nowiki>http://sample.com</nowiki></code> whereas <code>.+sample</code> would not, but <code><nowiki>http://example-sample.com</nowiki></code> would not be blocked by <code>sample</code>, though it would be blocked by <code>.+sample</code>. if anyone knows how to write a regex that would block both, please advise, as the second example (<code><nowiki>http://example-sample.com</nowiki></code>) is what i'm seeing more frequently in the spam pages... of course, one option is to add both instances of each word: one that is plain (<code>sample</code>) and one that is nearly wild (<code>.+sample</code>).
{{sig|Dani Banani|15:32, 11.5.2007 (MST)}}
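For illustration, here is a Python sketch of the behaviour described above. It assumes the extension glues each blacklist line onto a protocol-plus-hostname prefix roughly like <code>https?://[a-z0-9_.]*</code> — that wrapper is a guess, not the extension's real code — but under that assumption it reproduces the observations above, and suggests <code>[^/]*sample</code> as one candidate that blocks both forms:

```python
import re

def blocked(entry, url):
    # Hypothetical model of how a blacklist line is applied: each entry is
    # assumed to be appended to a protocol-plus-hostname prefix roughly
    # like this (the real wrapper in the extension may differ).
    pattern = r'https?://[a-z0-9_.]*(?:' + entry + r')'
    return re.search(pattern, url, re.IGNORECASE) is not None

# A plain entry misses the hyphenated domain, and '.+' demands at least
# one character before the word, so it misses the bare domain:
print(blocked(r'sample',      'http://sample.com'))            # True
print(blocked(r'sample',      'http://example-sample.com'))    # False
print(blocked(r'.+sample',    'http://sample.com'))            # False
print(blocked(r'.+sample',    'http://example-sample.com'))    # True

# '[^/]*sample' allows any run of non-slash characters (including none)
# before the word, so it catches both:
print(blocked(r'[^/]*sample', 'http://sample.com'))            # True
print(blocked(r'[^/]*sample', 'http://example-sample.com'))    # True
```

Again, whether <code>[^/]*sample</code> behaves this way on the live wiki depends on the extension's actual wrapper, so it would want testing before going in the list.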


== del.icio.us ==

i really feel that del.icio.us is too broad... some users may want to include a link on their userpage, and i don't think it should be necessary for users to request their specific URL to be whitelisted... i'm not even sure why it was blacklisted in the first place... i'll give this a few days for others to respond, but then i think i will remove it...
{{sig|Dani Banani|07:46, 11.19.2007 (UTC)}}


:Removed it from the list for now. Let's see if it becomes a problem again... -- [[User:Mr. Flay|Mr.]] [[User_talk:Mr. Flay|Flay]] 10:21, 24 December 2007 (MST)