Link shorteners: the long and short of why you shouldn’t use them (civilservice.gov.uk)
239 points by edent on June 10, 2021 | 145 comments



I worked with an org that ran their own link shortener... and used it for confirmation links! I'm not even kidding, you'd go to reset your password and as expected the link would be something like:

    ourapp.example.com/auth/reset?user=blah&token=1af17e73721dbe0c40011b82ed4bb1a7dbe3ce29eae4997c84600287f8866673d05fdaa1aa841a5a
and then they figured, oh man, those links are unsightly for email, we'd best turn that into something like:

    ourapp.example.com/s/xO8pR
That looks way cleaner in an email.


I hope you've managed to fix this, because this is an obvious security issue. A long token is used precisely because it is long and unguessable. The shortened URL is subject to enumeration attacks which can be used to hijack accounts.


Yeah. When I stumbled across this I had some conversations, with the net result that URLs containing authenticator tokens are no longer shortened :)


> A long token is used precisely because it is long and unguessable

This. So much fun can be had by enumerating link shortener URLs. I've experimented with enumerating some services' URL schema. Most of the time the link pointed to innocuous things like Amazon affiliate links or whatnot. Sometimes you would find interesting content that made you go 'wow!', but that was very rare.


Well, no.

Suppose you have 64-bit one-time identifiers for password reset links. With base-85 encoding, it's 11 characters, short enough to even type manually.

I suppose a password-reset link should expire within an hour. Scanning the entire 64-bit space in an hour in search of a working password-reset link is infeasible: rate-limiting will prevent it, and monitoring will warn about the attempt.

A feasible attack vector could be on the generation algorithm, but I suppose a good link shortener won't use a simple predictable RNG.
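
For illustration only (not necessarily how any real shortener does it), here's a minimal Python sketch of generating such a one-time identifier from a CSPRNG rather than a predictable RNG, using the URL-safe base64 alphabet rather than base-85:

    import secrets

    # 8 bytes (64 bits) from the OS CSPRNG, encoded with URL-safe base64
    # and the '=' padding stripped; that's 11 printable characters.
    token = secrets.token_urlsafe(8)
    print(token)  # e.g. 'Qf3xkT0m9aU'

The unguessability comes from the 64-bit space and the unpredictable generator, not from the length of the printable string.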


> With base-85 encoding

... and you lost your argument.

There's a reason why there's a variant of Base64 designed for URLs.


> because this is an obvious security issue

Not really. Usually password reset tokens are only valid for 10 or 15 minutes. With some basic rate limiting, you can stop a single actor from accessing more than one of those links in 15 minutes.

And even if they work around that, you just ask the user to verify their email address when they click on the link. Being able to enumerate the reset tokens and guess the right email address at the same time is highly unlikely.


Verifying their e-mail address would be useless, as the attacker would already know the e-mail.

An attacker who knows some existing user's email will go to the "forgot password" view and type in the e-mail of the user they plan to attack. Then they will start brute-forcing the token.

It is highly unlikely they had rate limiting, because those long tokens were there for a reason, and most frameworks that provide a similar forgot-password feature, Laravel for example, won't by default rate limit those tokens, or at least haven't in the past. I am not up to date with the current version of Laravel and I think it may be using signed URLs instead, which would also obviously be terrible if shortened.

So the original team who built the forgot-password flow didn't expect someone in the future to start shortening those URLs, so it is unlikely they figured rate limiting to be necessary in this case.

In most cases it would require conscious decision making and effort to specifically rate limit token guesses, which is likely to be out of scope.

Catch-all rate limiting by IP wouldn't work either, because the attacker could trivially use a botnet to brute force.

But in the OP example the e-mail/user was already in the URL, so it was included with the shortened URL. In this case the attacker could just try random short URLs until they hit something, and due to the redirection also immediately know the e-mail.


> Verifying their e-mail address would be useless, as the attacker would already know the e-mail.

How?

Everything you said is true for the implementation that was listed, but my point was short URLs for password reset aren't always bad, if other mitigations are in place, which should be in place anyway (rate limiting requests for password reset URLs and requiring verification of the email address).


The attacker would start out with a targeted user's email or login to the site. A personal email address is usually public. Start the password recovery process. Use a botnet to try different shortened links. A rate limit on password resets for an account would help if the attacker had a low probability of success before the reset link expired, but the attacker can cycle between multiple target accounts.


> Verifying their e-mail address would be useless, as the attacker would already know the e-mail.

In this scenario, no. We're talking about an attacker who is iterating over the ID space of a URL shortener in order to find a password-reset link that had been URL-shortened.

If that link only contains a token and not the user's email address then the attacker doesn't know the email, and asking them to provide their email on the password reset page would at least make it marginally less bad to url-shorten links containing password reset tokens. (But it's still a bad idea)


My point is that they don't have to iterate over all e-mails as long as they know at least one e-mail of any user, which doesn't seem that far-fetched, as they can simply trigger creation of this token for that particular e-mail themselves. It's especially effective if it's a targeted attack.

But yeah, in a scenario where you truly have short tokens and someone is not trying to create the token for a known e-mail themselves, just trying to find vulnerable links, then yes, it would help. But that's nonsense: just have longer tokens and you are fine. Why force the user to do extra steps when you can just have longer tokens?


> you can stop a single actor

It's not just unattached performers who are the threat. People of every relationship status and profession could be attacking.


I doubt they implemented rate limiting though.


Here’s some more baseless guesswork: I am absolutely certain they did.


I said "I doubt", you said "I am absolutely certain". Can you tell the difference?


Effectively there's no difference, both statements are equally worthless.


I doubt he can


On the URL shortener, or on the password reset token endpoint? Because only one of those will save you.


Well, add one more place and a simple base-36 alphanumeric shortened token like this can represent a number up to around two billion (36^6 ≈ 2.2 billion), so good luck with the enumeration attack - especially if you have a reasonable rate limiting scheme for requests. 2FA tokens are usually even simpler.


The first link is a hex encoded token, 80 chars, 4 bits per char = 320 bits of information. The second shortened link key is likely base 64 encoded, 5 chars, 64 bits per char = 320 bits of information. These should be basically the same from a security perspective. Is there something I'm missing here that you're suggesting?

Edit: This is wrong and should be 30 bits of information not 320 bits for the shortened form. 64 values = 6 bits not 64 bits.


> The second shortened link key is likely base 64 encoded, 5 chars, 64 bits per char

No, 64 values – or 6 bits. 6×5=30, not 320.
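
Or spelled out, purely for illustration:

    import math

    print(80 * math.log2(16))  # 80 hex chars    -> 320.0 bits
    print(5 * math.log2(64))   # 5 base64 chars  -> 30.0 bits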


Ah, RIP, my math is incorrect. Thanks.


base64 -> 64 possibilities -> 6 bits I guess?


We also don't know if the shortened urls are random.


I think it's usually base62.


You can also use an alternate alphabet like this one https://base64.guru/standards/base64url which replaces + and / with - and _, which are URL-safe characters.


Probably is. Base64 includes + and / which I believe need to be URL encoded, plus the = padding can mean an extra step if you want to remove them to make the URL pretty.

https://en.wikipedia.org/wiki/Base64#Base64_table
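
In Python terms (purely illustrative), the difference between the two alphabets looks like this:

    import base64
    import secrets

    raw = secrets.token_bytes(8)
    print(base64.b64encode(raw))                        # may contain '+', '/' and '=' padding
    print(base64.urlsafe_b64encode(raw).rstrip(b"="))   # uses '-' and '_', padding stripped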



> ourapp.example.com/s/xO8pR

Wow. And it makes it easier to brute-force (which I think you're insinuating). If the links have an auto-expire of 10 minutes, is the risk sufficiently mitigated? Or am I missing something else?


Require the user to enter their user id again.


Wouldn't help much with it right there in the URL too.

Even if it weren't, triggering a password reset for somebody and then fuzzing the URL space ends up working pretty quickly. I saw a few comments talking about rate limiting, but since these are by definition unauthenticated requests, that's not going to help you either.

Short version (heh) is you cannot make this secure.


Clearly the user id would be removed from the URL, and rate limiting can be done by IP address.


…and make sure that the original URL doesn’t include the user ID anywhere - it did in OP’s original example, which means that any attacker could scrape the ID just by watching what the redirect went to (assuming a normal link shortening service was used)


Right, that was implied.

You'd need to rate-limit the shortened URL endpoint as well, or increase the number of characters. Without it, an attacker could trigger a password reset for a user and brute force all shortened possibilities while entering that username. There'd be enough red flags to identify and stop this type of behavior, I think.


How would you rate-limit an unauthenticated-by-definition endpoint?


By IP. It doesn't have to be limited by much, just enough to discourage brute force attempts.
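
For the sake of argument, a minimal in-memory sketch of what that could look like (the limits are hypothetical; a real deployment would want shared state such as Redis rather than per-process memory):

    import time
    from collections import defaultdict

    WINDOW = 60      # seconds (hypothetical)
    MAX_HITS = 10    # lookups allowed per IP per window (hypothetical)
    _hits = defaultdict(list)

    def allow(ip: str) -> bool:
        """Return True if this IP may resolve another short link right now."""
        now = time.time()
        _hits[ip] = [t for t in _hits[ip] if now - t < WINDOW]
        if len(_hits[ip]) >= MAX_HITS:
            return False
        _hits[ip].append(now)
        return True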


Assuming any of it is being actively monitored.


Also, I never click on links in e-mails directly. For something like this I'd cut and paste the address. It seems Google puts another layer of redirection in Gmail to spy on you ("data-saferedirecturl", whatever that does in their JS).


It's a valve they can shut off when the targets are detected to be phishing or malware, so the link breaks.

And, of course, tracking.


Since they control the rendering, they can shut it off by not hyperlinking the link or by displaying a warning next to it; they don't need to put an always-on tracking mechanism in place that sends them click data even when the link is not determined to be malware.


I imagine that many organizations would like to know which of their employees did click a link that turns out to be malicious, so that the company can check those employees' computers for malware. Tracking could be useful for determining the severity of the damage done by a successful phishing attack.


For corporations on corporate computers fine, but this is Google tracking personal accounts.


Gmail can track when you cut the URL...


ah yes, let me instead manually type the url into a new chrome tab, google will have no idea I went there.


I'm not sure if this is still the case but there was a period of time when email clients would not transform the entire URL into a clickable link if it was too long. These were generally email clients which supported plain text only.

Anyway, I think that I'd be okay with a shortener like your example as long as it:

1. Required me to enter my user id again
2. Was only valid for 30 minutes


Outlook desktop client still has problems with really long links in that although they remain clickable, they don't actually go anywhere.


Yeah, we do that. It's simple and easy, allows us to tweak the destination of a published link if the site shuffles, lets us print simple short links that are quick to type but still are obviously our own domain.

You can't do it for everything... for one thing, you don't gain search engine karma from the links. But it's often very useful.


Do you also do that for links with secret tokens like the reset password link the op mentioned? Because -spoiler- that makes those links very easy to guess/brute force


No, I should have made that clear. Don't do it for links including tokens, user accounts, or anything like that. (Obviously.) Only on links you'd put out for mass consumption.

Still, it can eliminate lots of unsightly cruft in a link. It can replace this: https://thisonecompany.com/productinfo/specs/?sku=sdfdf432&f...

with: https://this.co/proddeets


There are many valid use cases for URL shorteners, but I am not sure if this is one of them.

IMHO, this is a display-layer issue that only affects human eyes, and should be handled at the display layer (HTML, email rendering, etc.) -- just don't display the whole URL somehow. Machines won't have any problem processing that long link.


Some platforms do this. My understanding is that they're especially motivated by the fact that many people don't really distinguish between mybank.com/changepassword and mybank.attacker.com/changepassword.

However, it really infuriates some vocal technical folks (e.g. https://www.androidpolice.com/2020/08/13/google-resumes-its-...). I think the compromise is good: hide the full URL by default, but have a setting or some affordance to show the full thing to people who do enjoy looking at it.


The title of this should really be "why you shouldn't use 3rd party link shorteners". There are lots of good reasons to use internal shorteners (and this article even ends by telling their own users to use their internal gov.uk link shortener).

At reddit we had a link shortener (redd.it) that was automatic for every post, which was useful for posting on social media, especially twitter, when the limit was 140 characters. There are lots of other uses for internal link shorteners too, like just having nicer URLs for publishing or saying out loud.

But yes, the article is totally right about 3rd party link shorteners.


lol but look what they have to go through for their own shortener:

> You can request a short URL if you’re the GOV.UK lead or a managing editor in your organisation.

> Submit a request for a short URL at least 2 weeks before you need it. GDS might not be able to meet your deadline if we do not get the full 2 weeks notice.


Do note that this isn't just a short link like with, say, bit.ly, but a vanity link like https://www.gov.uk/brexit-eucitizens , which means you actually need to check them for validity before assigning them.


Heh it's still the government. :)


I encouraged Quora to ban link shorteners. They were heavily used for spam and malware, avoiding whatever (meager) anti-spam mechanisms they were using. By "heavily" I mean "exclusively", though it's conceivable that somebody, at some time, was using it legitimately.

They never did implement that, but it sounds like it might be a good general rule for many web sites that accept and display content from users. If you're concerned about the way long links appear you can abbreviate them on the screen (the way HN does).


Turns out having your short URLs too short can also be problematic: https://arxiv.org/pdf/1604.02734v1.pdf

An example in this paper cites the shortener used by Google Maps. The researchers were able to enumerate all the short links by brute force and tie destinations to specific residential addresses. This is scary because now you've essentially exposed all the points of interest that one person visits (originating from their home address).

Google's response was to expand their URL tokens from 5 characters to 12. The sparseness makes it uneconomical for someone to brute-force their way through. Microsoft OneDrive's response was... interesting.


This is giving me pause to think on when you want short and dense pattern spaces, and when you want sparse spaces.

Published articles meant to be accessed publicly seem like a case for the former. The idea is for those references to be found, and a search space which is both predictable and small is preferred. Here I tend to like schemes such as:

   example.com/yyyy/mm/dd/nnnn.../<optional descriptive>
That is, for temporal data, explicitly code in the year, month, and day (and finer gradations of time if appropriate), then an item number (possibly sequential). The optional descriptive text might include author(s) and title(s).

Dates aren't always required. Some well-known cases (comparatively) are Amazon's reliance on SKU, iBiblio's reliance on ISBN, and Worldcat's reliance on OCLC. (You can omit all other index elements in the URL to obtain the desired result.)

Sparse spaces tend to be for non-published / non-public entities and documents. Google+ in particular had a 20--21 digit numeric userid (apparently used within Google as the account UUID). Even with some 3--4 billion registered profiles (the vast majority auto-created through Android device registrations), the space was sparse to a ratio of trillions (and higher when interest was focused on only the 100--300 million or so active accounts). This had a huge impact on the ability to crawl the space efficiently, as a brute-force search would have taken some time. Fortunately, Google provided sitemaps....

A related concept is James C. Scott's notion of legibility (from Seeing Like a State), and where it is and is not advantageous, and for whom.



For those who prefer the non-PDF arXiv link: https://arxiv.org/abs/1604.02734


In my company we created our own link shortener using AWS S3.

... just create an S3 bucket with a short domain, configure it for static web hosting, and upload empty files which have the "Redirect" metadata property set to the destination URL. Voila!

You won't have analytics (maybe this can be configured via AWS, but I can't say) but you don't need a server either.

I want to eventually create a friendly control panel to create and delete shortcuts using React, AWS Lambda and Cognito... but I still haven't had time... and we only need to add a handful of short links per year. This can also be scripted and done quickly through the CLI.
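
For what it's worth, a minimal boto3 sketch of the same trick (the bucket name, key and destination are made up; the bucket has to already be configured for static website hosting, and the redirect only takes effect through the website endpoint):

    import boto3

    s3 = boto3.client("s3")
    s3.put_object(
        Bucket="sho.rt-example",   # hypothetical bucket named after the short domain
        Key="promo",               # served as http://sho.rt-example/promo
        Body=b"",                  # empty object; only the metadata matters
        WebsiteRedirectLocation="https://example.com/some/very/long/landing-page",
    )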


> ... just create an S3 bucket with a short domain, configure it for static web hosting, and upload empty files which have the "Redirect" metadata property set to the destination URL. Voila!

Heavy “Dropbox is just cvs mounted over ssh, easy!” vibes over here.


Cloudflare's Workers/KV is pretty ideal for a link shortener. There's a small bit of js to write, but the KV database is just short->long and it's cached at the edge. And it's either free (< 100,000 requests/day) or $5 for 10 million requests.

And the admin panel provides a simple way to edit the KV database, so you don't have to write a db editor.


Note that Cloudflare Workers run before the cache unless you get creative (you basically need a second Cloudflare domain configured in front of your workers). For something as simple as a URL shortener it may not be critical but it does mean that you are paying for every request which can add up for a popular link.


Ah, I was talking about the other cache...the KV cache. Meaning that the short->long mapping is cached for performance reasons, so it's an eventually consistent, distributed, link shortener.

But, yes, not free if you exceed 100k requests/day. $5 per 10 million requests beyond that.

The idea of fronting it with the actually free regular cache is interesting. There is an API to control that "regular cache", so you could probably control that from the side rather than chaining the proxies/domains.


Another point that's missing: if the link shortener goes out of business, your links are unreachable or you have to change them all.


Oh absolutely. So many dead links scattered around the net. It's gotten so bad that an independent group, Archive Team, is brute forcing URL shorteners so these links aren't lost to time. Just look at how long the list of dead shorteners is: https://wiki.archiveteam.org/index.php/URLTeam#Dead_or_Broke...

72,287,136,510 links scanned, 14,632,045,317 shortened URLs archived. If you are interested it's easy to run their docker image to help with this and other archival projects.


As someone who used to shorten his links a lot, that was my biggest concern. As an avid blog reader, though, I tried to avoid short links as much as I could, although very often to no avail.


This was a talking point when Libya descended into civil war. What happens to the .ly TLD?

https://www.outsidethebeltway.com/libya-the-internet-and-bit...


> My link shortening tool provides me with analytics

I run a link shortener site, and use it privately and don't publicly expose the API.

One thing I noticed regarding analytics is that the click count is always skewed. When I post a shortened URL on Twitter, within seconds the click count is always `>10` views. After further investigation, it seems there are automated bots that scoop up URLs the very second they are posted.

Also Twitter runs little microbrowsers that scan the page for metadata which helps them create a 'preview' of the link.

After looking at the user agents of some requests, I'm seeing generic Firefox UAs, which I can only assume are random surveillants (not bots) who habitually scan Twitter for interesting or anomalous content. We truly do live in a world where nothing is left `unseen` (by bots or actual humans).


These days I never click shortened links without first verifying where they will take me. There is so much malware out there, and browsers are so nightmarishly insecure, that a single link click could result in getting completely pwned.

Pro-tip: append the “+” character to any bitly link to show the target link without first visiting it.

Pro-tip2: consider browsing with JavaScript disabled by default. Enable it on a per-domain basis.


I still wouldn't trust the plus character to not fail or whatever one day. I manually expand each short URL I get using various webservices. I'm sure there's an extension for that. I would still just walk to the expander website and paste it in though.

You're right. Short URLs are Shite urls.


Another, more technical option is to make the request using curl and print out the "location" header. Can browser extensions make non-redirecting requests and inspect the returned headers?
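
Roughly the same thing in Python (illustrative; the short link is made up), using a HEAD request that doesn't follow the redirect:

    import requests

    # The Location header reveals the destination without ever visiting it.
    resp = requests.head("https://bit.ly/3xampl3", allow_redirects=False)
    print(resp.status_code, resp.headers.get("Location"))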


There are some browser extensions that do similar things, I'd be interested if there is one that is particularly effective and security focused.


As the article notes, we don't want to socialize people to click on just any link.

I never click on links unless I know where they are going to lead me. Shortened links are one example. Even with an accompanying description, they raise red flags. Links to reputable image or video sharing sites without an accompanying description are another example, since you never know what is going to be on the other end.


Additionally, you can add the [1] "Actually Legitimate URL Shortener Tool" filter list to uBlock Origin, which is recommended by [2] gorhill.

[1] https://github.com/DandelionSprout/adfilt/discussions/163

[2] https://old.reddit.com/r/uBlockOrigin/comments/m5iecq/how_do...

Description: In a world dominated by bit.ly, ad.fly, and several thousand other malware cover-up tools, this list reduces the length of URLs in a much more legitimate and transparent manner. Essentially, it automatically removes unnecessary $/& values from the URLs, making them easier to copy from the URL bar and pasting elsewhere as links. Enjoy.


I didn't know about the bitly + trick, thanks.


Works on TinyURL and others too.


I'm impressed by the level of involvement in this kind of stuff by the UK government.


The 'Government Digital Service' (GDS, sort of 'tech company for the civil service') continually impresses me.

I've no idea who (even which party) initiated it, but it was just sort of suddenly awesome. Or maybe it just evolved rapidly under great (civil) leadership and was 'always' there just not so great.

The blog has lots of good stuff too; articles on accessibility especially often do well on HN, since that's something they 'obviously' need to worry about, and they actually do & do it well.

Often in comments here too (Robin something is a username I recall) - not the sort of crusty 'what's an HN?', 'you can't do that because the Oracle database on our IBM mainframe doesn't support it' department I might've formerly imagined at all.


Yeah I'm a long time admirer.

Sadly they are encountering resistance though. Some departments would rather spend £X bn on contracts with HP, Fujitsu, etc. so they can retain more control.

Also they used to publish amazing service status dashboards, showing how many transactions were published, error / success rates, etc. for every digital service.

Apparently these were all killed off recently with no replacement, and no good reason given.


  >> 'you can't do that because the Oracle database on our IBM mainframe doesn't support it' 
Oh gawd that's literally my day job... :O


Fortunately I rarely see Oracle on IBM z.

DB2 on IBM z is quite well capable, I'd just like for there to be less artificial barriers between teams involved in the places I encountered it :/

(Funnily enough some of it could be blamed on bright-eyed "modernizers")


Credit where credit's due, it was the Tories in 2011. Cameron's own initiative, rumour has it.


To the extent you can credit a single person, it's probably Martha Lane Fox, but yes David Cameron gave it a lot of political support.

Directgov 2010 and Beyond: Revolution not Evolution - https://www.gov.uk/government/publications/directgov-2010-an...


If that's the US Dept of Digital Services you're talking about, I think it was created by the Obama administration to rescue the rollout of the Affordable Care Act site where people can sign up for various plans. They seem to do some cool stuff.


The GDS is what the US dept was modeled after [0].

[0] https://en.wikipedia.org/wiki/Government_Digital_Service


No, like more people than you might think, I am a non-American with internet access.

(And commenting on an article hosted at civilservice.gov.uk no less.)


The comment is referring to the predecessor in the UK, the GDS.


Sort of; the actual founding of USDS happened after the healthcare.gov recovery, but was directly inspired by that and included many of the same people, including the first administrator, Mikey Dickerson.


> Combining the information you get safely and securely from things like Twitter Analytics or Instagram Insights with your Google Analytics helps tell you even more about how your content is performing.

Google Analytics and similar are blocked by a large (and increasing) number of visitors. By my estimates, about 40-80% of a website's visitors will not be counted in Google Analytics (depending on the website and audience). Some browsers now block those platforms without the need for any add-on too (like Edge in "Strict" privacy mode).

In short, GA is useless or soon will be.


I have a new project site that's largely viewed by technically competent people. All my other logs indicate that I get a mere 50 to 75 unique visitors per day - not heavily trafficked. Google Analytics often counts only about 10% of those visitors, which is easily confirmed by checking all the other metrics available to me.

So, yeah, I'm not sure how much longer they'll be a viable source of data.


Another problem with long lived short URLs is that the account used to generate it can be hijacked later and the short URL be pointed at a different destination with malware or other malicious intent at the end. I've seen this happen a lot in my time.


I appreciate the core message, but it's quite disappointing to see a government message exclusively presenting Google Analytics (coupled with Twitter/FB Analytics) as the one solution, especially as they problematize user privacy.

Given that this is mainly a message for those communicating on behalf of gov.uk, I think the best they could do is host a URL shortener for use by government communicators. It's also good advice for businesses.


Your concern is actually (partially) addressed:

> If you’re adding campaign URLs to offline materials – like posters or leaflets – and don’t want to feature a long web link, I’ve got good news for you too. GDS provides the option to you to request a shortened version of a full GOV.UK URL.

I'm disappointed that they mentioned Google Analytics. People willingly using Twitter (or Instagram) is a thing, involuntary Google tracking is another.


When the org pays for Google Analytics, Google does not share the tracking data with the rest of its business, so users' privacy is not harmed. GDS and many other UK government orgs do pay, for this reason.


I went to generate a QR code the other day for a URL, just went onto some random website from a quick Google search.

The generated QR code had the URL rewritten to a short URL, and buried in some small print was a limit to how many times the URL could be “scanned” before you have to pay.

I guess these sorts of sites _really_ count on people missing this and spending thousands on print before realising.


I use them with mass SMS, where we do have a character limit in our messaging tool. We can go above it, but then we get charged for multiple messages.

Custom domain, of course, or the carriers wouldn't like it.


They deliberately hide payloads; they are not trustworthy.

Now that I'm thinking about it, I should add bitly and related to my DNS blackhole...


The article ironically links to a LinkedIn explainer which states that if the URL is more than 26 characters it will be replaced with their short URL, not a hyperlink like on Twitter or many other platforms, which tells the reader where the link points even though it redirects through a tracking URL.

IMO, this defeats the point of the article.


If you're faced with shortened URLs and want to see where they lead before you click on them, URL expanders can be useful.

DDG "url expander" returns a number of these. I've been relying on the first result, https://urlex.org/ , for some months now, particularly as my router/firewall blocks most actual shorteners as spam vectors.

Note that if the shortened URL contains any specific private information, or would identify you specifically, you're still facing a risk. For shortened URLs found "in the wild", they're a useful tool.


Perhaps a use for blockchain technology - persistent storage of shortened URLs.


A massively over-engineered solution for the completely made-up problem of not being able to use the original URL. Yes, that’s perfect for blockchain technology!


Hey, we might just not fully trust our friends at archive.org to run an uncompromised database, and wish to trust 51% of a network instead. From that point of view we can point to a real problem and merely have it massively over-engineered!


I don't understand, you're not archiving the website, just the shortened link to the website, right? So if the original website (or Archive.org, or w/e) is compromised, aren't you screwed no matter what?



Feels like a bad idea. Shortened links often don't need the level of longevity blockchains provide, nor will they be able to afford the cost of decentralized storage with high availability.


It's probably a bad idea, but it might moon anyway and you might get rich creating a coin for it. The reality of money today, I suppose.


This is a joke, right?


Definitely, but some people are deep enough into the PoW AI that the Cloud is too thick to C# through, and many of the rest have knee-jerk reactions in the opposite direction.


I find it strange that an official government body is posting a blog promoting the use of Google Analytics over link shortening tools for click tracking analytics purposes.

I understand the argument to use a secure, well-formatted link shortener. I don't understand the argument to use Google Analytics (which captures way more data than I need in most cases) as the full and final solution to all of my analytics requirements.


I wrote a thing about this 12 years ago: http://joshua.schachter.org/2009/04/on-url-shorteners (HN discussion here: https://news.ycombinator.com/item?id=545565 )


I've really grown to hate link shorteners. They get used to obfuscate the real URL, so of course my pihole and other adblocking software blocks them. But even the local gov't insists on using shorteners in the links they put in e-mail. Instead of just making the URL of the website sane. So I have to jump through hoops just to get to the site they link to.


I run my own using YOURLS (https://yourls.org). It addresses the issues brought up in the article:

Control your links, override slug names so they are readable, maintain private analytics, and stay branded by running it on your own domain.

It’s easy to set up and maintained by many of the people working on WordPress core. I recommend it.


There are good uses, but with the exception of DOI (and apparently gov.uk's own), official documents are not one of them.


We are also looking at DOI for UK gov docs & data to make them easier to cite.

You can give us feedback at https://github.com/alphagov/open-standards/issues/75


You should specify which country the "government" refers to. Do you propose a generic thing for governments around the world, or is it specific to a particular one?


Thanks - I've updated it.


A huge under-appreciated reason (at least when talking about SMS): many major SMS carriers will block SMS messages containing links from popular URL shorteners because these are widely associated with spam and fraud. The same is true to a lesser extent when sending emails.


Is there a tool for anonymously resolving https://goo.gl forwarding URLs? It would be very useful, 'cause this service is popular.

P.S. The service is discontinued, but a lot of links are still available. Bitly and Ow.ly support would be cool too.


I believe APIs exist. You can follow redirects in most web clients pretty easily, or there are "redirect detectives" online: https://www.redirecttracker.com/


The article's a bit odd. Half of the advice contradicts the other half.

You don't need to link shorten, because social media does this already. But also, shortened links are bad and unprofessional.

Don't worry - GA will do analytics. But also, watch out for privacy.


I generally use(d) link shorteners on messengers like Facebook just so that the link doesn't take all the conversation space and the previous lines can still be visible.


Why does the in-house URL shortener require 2 weeks' notice and masses of paperwork... Just throw together a gov.uk shortener tool...


I assume the short URL would be named (hypothetically gov.uk/ucas -> gov.uk/university-clearing-through-ucas). Fully established tech companies have a similar process for requesting short links; for instance Google has an internal form to request g.co/ short links.


I do not believe that it takes 2 weeks to get one at Google?


UK Government URLs are (practically) forever.


Probably because they want to be in the loop.


I guess this is what they refer to as red tape?

> How to request a short URL

> Submit a new feature request using the support form.

> You’ll need to tell us:

> - the reason you need a short URL

> - the content or page the short URL will link to

> - how your short URL will be used in marketing and promotion

> - the channels you will be using, the number of users who will be targeted

> - what the main message will be in your marketing and communications

> - how many government departments or organisations will promote the short URL


I think there should be an automatic tool for anything under .gov.uk and .nhs.uk, and then a manual process for other links.


I get the impression that they want to ensure people choose meaningful short(ish) URLs, rather than getting random alphanumeric suffixes, because they are invested in the trust placed in the gov.uk domain and it not looking like phishing bullshit. So it makes sense to me that they want to curate the namespace rather than making it either self-service or fully automated.


Yeah, absolutely, but you can write a tool that does that and doesn't require a 2-week wait time... It's just a crazy example of gov bureaucracy!


What about link largeners? How secure is urldefense really when everyone runs on it?


Ironically, I got a letter from the NHS about getting my COVID-19 vaccination, and it included a bit.ly link to some official NHS guidelines document.


A letter can actually benefit from easier-to-type links, though. There's at least a point to it there. Too bad that bitly links contain more entropy than the password of the person typing them and don't avoid similar characters... they could have chosen a service that actually optimizes for copying from paper instead of highest entropy per character.
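
Something like Crockford's base32 alphabet, which drops the easily-confused characters, would have been kinder. A rough sketch, purely illustrative:

    import secrets

    # Crockford base32: no I, L, O or U, so 0/O and 1/I/l can't be misread on paper.
    ALPHABET = "0123456789ABCDEFGHJKMNPQRSTVWXYZ"

    def paper_friendly_slug(length: int = 8) -> str:
        return "".join(secrets.choice(ALPHABET) for _ in range(length))

    print(paper_friendly_slug())  # e.g. '7KQ2M9XB'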


I don't think I'll be taking any advice from that surveillance state, thank you. It's like Darth Vader giving skincare tips.


As odd as this comparison is, it isn't even advice meant for you. It's meant for other Government agencies.


Ironically I think he'd have some great advice, as does this article.


Ironically, the BBC (effectively part of the British civil service, no matter what they claim) uses a link shortener for their 7 Days News Quiz.


Meanwhile, domain registrars are still emailing customers asking them to "click this link" to verify their contact information. No URL shortening there, just a wildly irresponsible process.

Edit: I know it's required by ICANN (I read the emails); it's the "click here" action that bothers me and perpetuates dangerous behavior.


It's mandated by ICANN.


Does ICANN actually mandate that a 'click this link' be included in the email, or that an email is sent asking the user to verify data so that a 'please login to your account to verify' would suffice?


> Does ICANN actually mandate that a 'click this link' be included in the email

Probably. It's been a while since I worked in that industry, but ICANN has always been pretty picky about the exact contents of emails.

Changing the user workflow in the way you're describing is out of the question. The entire purpose of clicking a link in the email is to confirm that the email was received at the domain's WHOIS contact address. Allowing a user to log in and click "confirm" without clicking a link in the email wouldn't confirm that.


> Does ICANN actually mandate that a 'click this link' be included in the email

This is my issue. I've spent years telling clients not to click any links in an email from their bank/insurance etc., and yet Network Solutions and others are still putting a big fat "Click here" button in the email.



