There may actually be some utility here. LLM agents refuse to traverse the links. Tested with gemini-3-pro, gpt-5.2, and opus 4.5.
edit: gpt-oss 20B & 120B both eagerly visit it.
I wish this came a day earlier.
There is a current "show your personal site" post on top of HN [1] with 1500+ comments. I wonder how many of those sites are or will be hammered by AI bots in the next few days to steal/scrape content.
If this can be used as a temporary guard against AI bots, that would have been a good opportunity to test it out.
1. https://news.ycombinator.com/item?id=46618714
Glad I’m not the only one who felt icky seeing that post.
I agree. My tinfoil-hat signal told me this was the perfect way to ask people for bespoke, hand-crafted content, which of course AI will love to slurp up to keep feeding the bear.
I posted my site on the thread.
My site is hosted on Cloudflare and I trust its protection way more than a flavor-of-the-month method. This probably won't be patched anytime soon, but I'd rather have some people click my link than have everyone, AI included, avoid it because it looks fishy :)
Of course, the downside is that people might not even see your site at all because they’re afraid to click on that suspicious link.
Site should add a reverse lookup. Provide the poison and antidote.
Sounds like a useful signal for people building custom agents or models. Being able to control whether automated systems follow a link via metadata is an interesting lever, especially given how inconsistent current model heuristics are.
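A rough sketch of what that lever could look like in an agent's fetch layer (entirely hypothetical; the names and the heuristic are illustrative, not any real agent framework's API):

    import re
    from urllib.parse import urlparse

    # Hypothetical heuristic: path endings that make a link look like a payload.
    SUSPICIOUS_EXTENSIONS = (".exe", ".bat", ".vbs", ".dll", ".docm", ".scr", ".zip")

    def should_follow(url: str) -> bool:
        """Return False for links an automated agent might decline to fetch."""
        path = urlparse(url).path.lower()
        if path.endswith(SUSPICIOUS_EXTENSIONS):
            return False
        # Keyword-stuffed, underscore-heavy paths are another creepy-link tell.
        if "_" in path and re.search(r"login|invoice|logger|install|verification", path):
            return False
        return True

    print(should_follow("https://jpmorgan.c1ic.link/G4JQKX_money_request.dll"))  # False
    print(should_follow("https://news.ycombinator.com/item?id=34609461"))        # True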
Related: A URL shortener not shortening the URL but makes it look very dodgy (434 points, 2023, 100 comments) https://news.ycombinator.com/item?id=34609461
My favorite link of all time:
https://jpmorgan.c1ic.link/logger_zcGFC2_bank_xss.docm
Definitely not meta
Imagine using this as your personal website lol
https://jpmorgan.c1ic.link/G4JQKX_money_request.dll
https://jpmorgan.web-safe.link/flash_7KzCZd_money_request
I love this version and I hope you do too.
well played sir
IIRC, shadyurl was the original version of this. Doesn't seem to be around anymore, though.
ShadyURL had a whole bunch of different, incredibly shady domains that were used at random. It was beautiful.
I'm not sure what the use case for this is, but I've been using it as an inefficient messaging service with my girlfriend, e.g.:
https://c1ic.link/campaign_WxjLdF_login_page_2.bat
You seem to be able to encode arbitrary text, so long as it follows [A-Za-z0-9]+\.[A-Za-z0-9]+
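If that observation is right, a quick local check against the quoted pattern (assumed from the comment above, not confirmed by the site) would tell you whether a message survives before you submit it:

    import re

    # Pattern quoted above; whether the site really enforces it is an assumption.
    CUSTOM_TEXT = re.compile(r"^[A-Za-z0-9]+\.[A-Za-z0-9]+$")

    print(bool(CUSTOM_TEXT.match("SeeYouAt7.ok")))   # True: letters and digits only
    print(bool(CUSTOM_TEXT.match("see you at 7?")))  # False: spaces and '?' rejected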
I like how the old-school HN comment section does not care about creepy links at all. Or links, for that matter.
This had to be done:
https://wellsfargo.c1ic.link/TODO_obfuscate_url_8wyS7G_hot_s...
This is fun. Is it not checking for previously submitted URLs though? I can seemingly re-submit the exact same URL and get a new link every time. I would expect this to fill the database unnecessarily but I have no idea how the backend works.
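For what it's worth, deduplication in a shortener backend is usually just a reverse lookup on the long URL before minting a new slug. A minimal sketch, purely illustrative and not necessarily how this site's backend works:

    import secrets

    # Toy in-memory "database"; a real service would use a persistent store.
    slug_to_url: dict[str, str] = {}
    url_to_slug: dict[str, str] = {}

    def shorten(long_url: str) -> str:
        """Reuse the existing slug for a URL instead of minting a duplicate row."""
        if long_url in url_to_slug:  # the check the comment above is asking about
            return url_to_slug[long_url]
        slug = secrets.token_urlsafe(6)
        slug_to_url[slug] = long_url
        url_to_slug[long_url] = slug
        return slug

    assert shorten("https://example.com") == shorten("https://example.com")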
Am I missing something, or would these essentially be implemented via DNS records? It's not clear to me that keeping the links in a database would be necessary at all (unless the DNS records are what you mean by "database")
DNS only resolves the host part. The path never passes through a DNS query.
In example.com/blah, the /blah part is interpreted by the host itself.
And apart from that I would indeed consider DNS records a database.
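Concretely: a wildcard DNS record can point every generated hostname at the same server, but that server still needs something like a database to turn each path back into the original URL. A toy sketch of that lookup, with hypothetical names and obviously not this site's actual code:

    from http.server import BaseHTTPRequestHandler, HTTPServer

    # Hypothetical path-to-destination mapping; DNS never sees any of this,
    # because resolution stops at the hostname.
    PATHS = {
        "/campaign_WxjLdF_login_page_2.bat": "https://example.com/real-page",
    }

    class Redirector(BaseHTTPRequestHandler):
        def do_GET(self):
            target = PATHS.get(self.path)
            if target is None:
                self.send_error(404)
                return
            self.send_response(302)              # redirect to the stored original URL
            self.send_header("Location", target)
            self.end_headers()

    if __name__ == "__main__":
        HTTPServer(("localhost", 8080), Redirector).serve_forever()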
Fantastic! I miss the original ShadyURL.
https://news.ycombinator.com/item?id=31386108
Is this suspicious: https://microsoft.c1ic.link/0B7jqd_invoice.vbs ?
For funsies I shortened https://creepylink.com
And got: https://c1ic.link/account_kPvfG7_download_now.bat
I also tried that and got https://twitter.web-safe.link/BUuLrg_document.zip
Squared:
https://c1ic.link/ad_k9OFWW_redeem_gift.bat
Saw this on the relaunched Digg and figured HN would appreciate it.
I don't appreciate how AI-generated this website looks.
It seems appropriate that, for a website whose purpose is to make links which raise your suspicions, the visual design itself also raises your suspicions.
Just looks like every other generic framework-oriented site.
which bit are you getting an AI smell from?
gradient background, card, button
Perhaps, but nearly every tutorial in all the modern frameworks demonstrates this exact style.
Digg is back?
Edit: looks like you need an invite code.
Bummer
I am using these creepy links to share content with office people.
Please take my upvote. :)
Haha, it's fun. Just thinking: is there some place where creepy links would be better?
I've been at a company that internally sends out fake links that log the user and links to an educational page on internet safety.
I honestly don't mind too much since it's a once-a-year thing (Hacktober), and companies should be trying to catch out employees who click any and all links.
We used to have fun hammering those URLs with millions of requests from a VPS whenever they sent such emails to role mailboxes.
Eventually we got asked to please make it stop. I asked them to please stop sending fake phishing emails to robots.
Use case? Besides humor and phishing tests.
Fun!
For humour I shortened "https://www.facebook.com/"
And got https://twitter.web-safe.link/root_4h3ku0_account_verificati...
I added google.com and it spit out https://twitterDOTc1icDOTlink/install_Jy7NpK_private_videoDOTzip
Interesting that it spit out a .zip URL. I was not expecting that, so I changed all the "." to "DOT" so I don't get punished for posting a spammy link, despite this literally being a website to make links as spammy and creepy as possible.
lol, I'm not clicking a .vbs link
It is hilarious, and I'm not clicking any of the links, lol.
Please don't make any more URL shorteners; they are just a bad idea.
https://wiki.archiveteam.org/index.php/URLTeam
I always end up making my own; they're so simple to write.
It saves using one of the "free" ones that looks free but actually puts you on a free trial; then you can't access your links after the trial expires.