Talk:Spam blacklist
The associated page is used by the MediaWiki Spam Blacklist extension, and lists regular expressions which cannot be used in URLs on any page in Wikimedia Foundation projects (as well as on many external wikis). Any Meta administrator can edit the spam blacklist, either manually or with SBHandler. For more information on what the spam blacklist is for, and on the processes used here, please see Spam blacklist/About.
Please sign your posts with ~~~~ after your comment. This leaves a signature and timestamp so conversations are easier to follow. Completed requests are marked as {{added}}/{{removed}} or {{declined}}, and are generally archived quickly. Additions and removals are logged · current log 2014/08.
Snippet for logging: {{sbl-log|9518635#{{subst:anchorencode:SectionNameHere}}}}
Proposed additions[edit]
This section is for proposing that a website be blacklisted; add new entries at the bottom of the section, using the basic URL so that there is no link (example.com, not http://www.example.com). Provide links demonstrating widespread spamming by multiple users on multiple wikis. Completed requests will be marked as {{added}} or {{declined}} and archived.
Cross-wiki spam[edit]
hisosoccer.com
hisosoccer.blogspot.com
Multiple additions xwiki.--Glaisher (talk) 11:21, 26 July 2014 (UTC)
cool-fuel.co.uk[edit]
cool-fuel.co.uk
see diff. --Dirk Beetstra T C (en: U, T) 11:06, 4 August 2014 (UTC)
Added --Dirk Beetstra T C (en: U, T) 11:07, 4 August 2014 (UTC)
tiredeal.co.il[edit]
tiredeal.co.il
Spambot on multiple wikis. MER-C (talk) 11:09, 8 August 2014 (UTC)
lovetoknow.com[edit]
lovetoknow.com
Copyvio website full of ads. 109.229.11.82 20:17, 13 August 2014 (UTC)
Declined at this point in time.
- Looking at the global use, it would seem that this is something that should be taken up with the individual sites, rather than asking globally for something that those sites have allowed thus far. I think you would be best served starting the conversation at w:en:MediaWiki talk:Spam-blacklist. — billinghurst sDrewth 08:10, 14 August 2014 (UTC)
Proposed additions (Bot reported)[edit]
This section is for domains which have been added to multiple wikis as observed by a bot.
These are automated reports; please check the records and the links thoroughly, as the bot may report good links! For some more info, see Spam blacklist/Help#COIBot_reports. Reports will automatically be archived by the bot when they become stale (fewer than 5 links reported, none of which have been edited in the last 7 days, and where the last editor is COIBot).
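The staleness criteria above can be sketched as a small predicate. Note that all field names here (`links`, `last_edited`, `last_editor`) are hypothetical illustrations, not COIBot's actual data schema:

```python
from datetime import datetime, timedelta

def is_stale(report, now):
    """Return True if a COIBot report matches the archiving criteria
    described above: fewer than 5 links reported, no edits in the
    last 7 days, and COIBot itself was the last editor.
    (Field names are illustrative, not COIBot's real schema.)"""
    return (
        len(report["links"]) < 5
        and now - report["last_edited"] > timedelta(days=7)
        and report["last_editor"] == "COIBot"
    )
```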
COIBot[edit]
The LinkWatchers report domains meeting the following criteria:
- When a user mainly adds this link, and the link has not been used too much, and this user adds the link to more than 2 wikis
- When a user mainly adds links on one server, and links on the server have not been used too much, and this user adds the links to more than 2 wikis
- If ALL links are added by IPs, and the link is added to more than 1 wiki
- If a small range of IPs have a preference for this link (but it may also have been added by other users), and the link is added to more than 1 wiki.
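The four criteria above can be sketched as a single check. Everything concrete here is an assumption: the field names, the 50% "mainly adds" share, and the "used too much" threshold of 20 are invented for illustration; the bot's real internals are not documented on this page.

```python
def should_report(stats):
    """Rough sketch of the four LinkWatcher reporting criteria.
    All field names and numeric thresholds are hypothetical."""
    top_user_share = stats["top_user_adds"] / stats["total_adds"]
    mainly_one_user = top_user_share > 0.5        # "a user mainly adds this link"
    not_used_much = stats["total_adds"] < 20      # "not been used too much"
    # Criteria 1 and 2 (per-link and per-server) have the same shape here
    if mainly_one_user and not_used_much and stats["top_user_wikis"] > 2:
        return True
    # Criterion 3: ALL additions made by IPs, on more than 1 wiki
    if stats["all_ip_adds"] and stats["wikis"] > 1:
        return True
    # Criterion 4: a small IP range prefers this link, on more than 1 wiki
    if stats["ip_range_preference"] and stats["wikis"] > 1:
        return True
    return False
```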
COIBot's currently open XWiki reports
Proposed removals[edit]
This section is for proposing that a website be unlisted; please add new entries at the bottom of the section.
Remember to provide the specific domain blacklisted, links to the articles in which it is used or to which it would be useful, and arguments in favour of unlisting. Completed requests will be marked as {{removed}} or {{declined}} and archived. See also /recurring requests for repeatedly proposed (and refused) removals.
pro-d.ru[edit]
pro-d.ru
fall.pro-d.ru is a fan site for the game The Fall: Last Days of Gaia; there are no other sites in this second-level domain. The site got blacklisted because of the far too broad regex \bpro-(?!(goroda|speleo)).*?\.ru\b (originally \bpro-*?\.ru\b), which blocks all domains in the .ru zone that start with "pro-". Effectively it blocks every Russian site that has "professional" or "about" in its name.
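The breadth of the rule quoted above is easy to verify with Python's re module (pro-fessional.ru is an invented example domain):

```python
import re

# The blacklist rule quoted above
rule = re.compile(r'\bpro-(?!(goroda|speleo)).*?\.ru\b')

# The fan site the requester wants unlisted is caught...
assert rule.search('http://fall.pro-d.ru/')
# ...as is any .ru domain starting with "pro-" (invented example)
assert rule.search('http://pro-fessional.ru/')
# Only the two carved-out exceptions escape the rule
assert rule.search('http://pro-goroda.ru/') is None
assert rule.search('http://pro-speleo.ru/') is None
```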
- We can look at making changes. That said, I am not sure that fan sites are welcome in the Wikipedias; I know that enWP specifically excludes them. It would be good to have some feedback from ruWP about the proposal to remove, as, from memory, there was problematic spamming at the time. — billinghurst sDrewth 08:38, 11 July 2014 (UTC)
Timesofbook.com[edit]
timesofbook.com
Can anyone remove timesofbook.com from the global blocking list?—The preceding unsigned comment was added by 106.208.60.149 (talk • contribs) .
- I don't think it was incorrect - this was spammed. --Dirk Beetstra T C (en: U, T) 07:45, 3 August 2014 (UTC)
Declined Not blacklisted globally; blacklisted at English Wikipedia (en:WP:SBL) and Bosnian Wikipedia only. --Glaisher (talk) 15:07, 3 August 2014 (UTC)
oxfreudian(dot)com[edit]
oxfreudian.com
Could someone kindly please remove the above website from the spamlist? It is Richard M. Waugaman's website which should be listed in an "External links" section on his page. Thank you for your consideration in this matter. Knitwitted (talk) 00:12, 4 August 2014 (UTC)
- No, it was spammed inappropriately. Please ask for whitelisting of a specific link on en.wikipedia (e.g. the index.htm or an about page); see en:MediaWiki talk:Spam-whitelist.
Declined --Dirk Beetstra T C (en: U, T) 06:03, 4 August 2014 (UTC)
Troubleshooting and problems[edit]
t.co incorrectly blocking[edit]
t.co
t.co is a URL shortener, so blocking it is necessary. However, some domains that use .co as part of a second-level name are also caught by this blacklist entry. For example, the Japanese company Kinki Nippon Tourist Individual Tour Sales Co., Ltd. (近畿日本ツーリスト個人旅行販売) has the domain www.knt-t.co.jp, but this cannot be linked now.--Jkr2255 (talk) 03:41, 25 February 2014 (UTC)
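The false positive is reproducible with Python's re module: \b treats the hyphen as a word boundary, so the rule \bt\.co\b also fires inside knt-t.co.jp.

```python
import re

rule = re.compile(r'\bt\.co\b')

# The shortener itself is caught, as intended...
assert rule.search('https://t.co/AbC123')
# ...but the hyphen in knt-t.co.jp is a word boundary too,
# so the Japanese company's domain is a false positive:
assert rule.search('http://www.knt-t.co.jp/')
```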
Status: Done. I was able to make the regex more specific for the shortener. — billinghurst sDrewth 09:28, 25 February 2014 (UTC)
User:Billinghurst - this needs to be done differently, as t.co became linkable: see diff. I have undone this adaptation and returned to \bt\.co\b for now; please adapt it to something that does solve the problem. --Dirk Beetstra T C (en: U, T) 16:19, 27 April 2014 (UTC)
https://www.mediawiki.org/wiki/Extension:SpamBlacklist#Usage is obviously wrong. t.co has been added 70 times since the change of the rule; the ones I checked were typical redirects which should have been blocked. --Dirk Beetstra T C (en: U, T) 16:29, 27 April 2014 (UTC)
- bugzilla:64541 — billinghurst sDrewth 11:08, 28 April 2014 (UTC)
- The bugzilla suggested (?<!-)\bt\.co\b but this did not prevent the addition of twitter links. — billinghurst sDrewth 13:10, 23 July 2014 (UTC)
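In Python's re engine, at least, the suggested lookbehind does fix the knt-t.co.jp false positive while still matching bare t.co links; the extension's actual PCRE handling of the surrounding URL text may differ, which could be why twitter links still slipped through.

```python
import re

rule = re.compile(r'(?<!-)\bt\.co\b')

# The hyphen-preceded boundary in knt-t.co.jp no longer matches:
assert rule.search('http://www.knt-t.co.jp/') is None
# A bare t.co shortener link is still caught by this pattern:
assert rule.search('https://t.co/AbC123')
```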
Closed. It should be possible to add it now. Special:Diff/9315971. --Glaisher (talk) 16:56, 26 July 2014 (UTC)
kochi.com incorrectly blocking[edit]
kochi.com
Just like the above discussion: sites related to Kochi prefecture or Kochi city (in Japan) sometimes use ***-kochi.com, which cannot be linked.--Jkr2255 (talk) 12:07, 21 March 2014 (UTC)
- Should be Done, please test — billinghurst sDrewth 15:40, 21 March 2014 (UTC)
Removed; closing as done — billinghurst sDrewth 15:41, 8 June 2014 (UTC)
SBHandler broken[edit]
SBHandler seems to be broken - both Glaisher and I had the problem that it stops after closing the thread on this page, but before the actual blacklisting. Do we have someone knowledgeable who can look into why this does not work? --Dirk Beetstra T C (en: U, T) 04:08, 30 April 2014 (UTC)
User:Erwin - pinging you as the developer. --Dirk Beetstra T C (en: U, T) 04:16, 30 April 2014 (UTC)
- FYI when you created this section with the name "SBHandler", you prevented SBHandler from being loaded at all (see MediaWiki:Gadget-SBHandler.js "Guard against double inclusions"). Of course, changing the heading won't fix the original issue you mentioned. But at least it will load now. PiRSquared17 (talk) 15:30, 18 June 2014 (UTC)
Discussion[edit]
COIBot / LiWa3[edit]
I am busy slowly restarting COIBot and LiWa3 - both will operate from fresh tables (LiWa3 started yesterday, 29/12/2013; COIBot started today, 30/12/2013). As I am revamping some of the tables, they need to be regenerated (e.g. the user auto-whitelist tables need to be filled, as does the blacklist data for all the monitored wikis), so expect data to be off, and some functionality may not be operational yet. LiWa3 starts from an empty table, which also means that autodetection based on statistics will be skewed. (I am unfortunately not able to resurrect the old data; that will need to be done by hand.) Hopefully things will be normal again in a couple of days. --Dirk Beetstra T C (en: U, T) 17:26, 30 December 2013 (UTC)
Change in functionality of spam blacklist[edit]
Due to issues with determining the content of parsed pages ahead of time (see bugzilla:15582 for some examples), the way the spam blacklist works should probably be changed. Per bugzilla:16326, I plan to submit a patch for the spam blacklist extension that causes it to either delink or remove blacklisted links upon parsing, or replace them with a link to a special page explaining the blacklisting. This could be done either in addition to or instead of the current functionality. Are there any comments or suggestions on such a new implementation? Jackmcbarn (talk) 20:45, 3 March 2014 (UTC)
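As a toy illustration of the "delink upon parsing" idea (Python standing in for the extension's PHP, with external-link syntax handling greatly simplified and an invented blacklist pattern):

```python
import re

# Invented stand-in pattern; the real list lives on the blacklist page
BLACKLIST = [re.compile(r'\bexample-spam\.com\b')]

def delink(wikitext):
    """Replace external links whose URL matches a blacklist rule with
    their bare label (or bare URL), instead of refusing to save the page."""
    def repl(match):
        url, label = match.group(1), match.group(2)
        if any(rule.search(url) for rule in BLACKLIST):
            return label or url  # keep the visible text, drop the link
        return match.group(0)    # leave non-blacklisted links untouched
    # Matches [http://... optional label] external-link syntax
    return re.sub(r'\[(\S+)(?: ([^\]]*))?\]', repl, wikitext)
```

Under this scheme, a page containing `[http://example-spam.com/x spam site]` would save and render "spam site" as plain text, rather than triggering a save error.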
- Hi!
- I suggest not replacing the current functionality, and will give an example for this:
- In local wikis like w:de, we sometimes have the situation that we want to prevent people from using a certain domain like "seth-enterprises.example.org" everywhere in the article namespace, with the exception of just one article (the one about the institution, e.g. "seth enterprises"). So in this case we remove all links to that domain from w:de, but place a link to the domain in that one article. Afterwards we blacklist the domain, so that nobody can add the link anywhere else. In that one article the link should still work.
- Could we cope with this scenario, if the SBL functionality was changed? -- seth (talk) 15:25, 15 June 2014 (UTC)
- @Jackmcbarn: I think that would break legitimate links on a wiki (sometimes a site is used minimally in a good way, e.g. in references, but is massively spammed and abused elsewhere; it then gets blacklisted).
- @Lustiger Seth: such links are better off specifically whitelisted. On en.wikipedia, we would whitelist the landing page ('seth-enterprises.example.org/index.htm') or the about page (often the index.htm is 'invisible', forcing us, in principle, to whitelist the domain only, and that would open up the abuse possibility again if the problem was the linking of the domain itself). In rare cases, we would whitelist the domain only. De-blacklisting, linking, and re-blacklisting is not a real solution - there are edit scenarios where the only way to repair a page is to de-blacklist again, repair, and re-blacklist. For an uninterrupted edit experience, it is better that a whitelisting solution is found for all blacklisted links. --Dirk Beetstra T C (en: U, T) 03:28, 19 June 2014 (UTC)
- Hi!
- Whitelisting does not help in many of the mentioned cases, because the URL used by the spammers can be the same as the URL that is needed in an article. If there is a better solution, please tell me. The edit filter could of course be used for a combination of a link block with a specific article exception, but we try not to use the edit filter for performance reasons (if we did not limit its use, the edit filter would not work properly). -- seth (talk) 09:54, 19 June 2014 (UTC)
- Whitelisting of the type 'http://seth-enterprises.example.org/index.htm' has never resulted in problems on en.wiki, and neither has 'http://seth-enterprises.example.org/about.htm'. In fact, heavily abused websites have their index.htm and/or about.htm whitelisted, and are still not abused. --Dirk Beetstra T C (en: U, T) 10:51, 19 June 2014 (UTC)
- We would of course not whitelist 'http://seth-enterprises.example.org' - that would open up everything, and an end-of-string delimiter also does not help, as the main domain is generally what is abused. --Dirk Beetstra T C (en: U, T) 11:14, 19 June 2014 (UTC)
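The blacklist/whitelist interplay discussed in this thread can be sketched as follows; the patterns are invented examples built on the thread's placeholder domain (the real lists live in the Spam-blacklist and Spam-whitelist MediaWiki pages), and the matching logic is a simplification of what the extension actually does:

```python
import re

# Invented example patterns, mirroring the discussion above
BLACKLIST = [re.compile(r'\bseth-enterprises\.example\.org\b')]
WHITELIST = [re.compile(r'\bseth-enterprises\.example\.org/index\.htm\b')]

def link_allowed(url):
    """A whitelist hit exempts the URL from the blacklist check.
    This is how a single page of a blacklisted domain (e.g. its
    index.htm) can be unblocked without opening up the whole domain."""
    if any(rule.search(url) for rule in WHITELIST):
        return True
    return not any(rule.search(url) for rule in BLACKLIST)
```

With these rules, the landing page is linkable while every other URL on the domain stays blocked, which is the whitelisting outcome Beetstra describes.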