Last night, ChrisC idly asked me why, on Twitter, hashtags are called hashtags.
Simple, I said, because they start with a #.
But, he said, they're called hashtags in the US, where the # is known as a pound sign.
I have a vague idea that # is sometimes called a pound sign; it's always struck me as a bit odd. I've always assumed it was related to the days when character sets were limited and it was used in place of £.
But of course they're hashtags. After all, they may call it a pound sign, but they don't pronounce it "pound".
But, said ChrisC, they do. In particular, in the US, C programmers talk about "pound defines".
This is just a bit of stray C syntax. Suppose you want your programme to limit the number of available heffalumps to 7; you can keep checking that:
heffalumps < 7
If you're worried that in the future you might want to allow more heffalumps you could do something like this:
#define MAX_HEFFALUMPS 7
and every time you want to check, you can just say:
heffalumps < MAX_HEFFALUMPS
Every time you write MAX_HEFFALUMPS a magical but dumb thing called the preprocessor will slavishly ensure that that gets treated as a 7. As computers improve and can fit more heffalumps in, you can just update it to:
#define MAX_HEFFALUMPS 24
instead of having to change it in lots of different places. This is commonly referred to as a "hash define". Lots of other instructions begin with the # character. See here for more detail than you can possibly want.
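If it helps to see it in one piece, here's a minimal sketch of a complete programme built around that define (the names and numbers are, of course, just made up for illustration):

#include <stdio.h>

/* The preprocessor replaces every MAX_HEFFALUMPS below with 7
   before the compiler ever sees the code. */
#define MAX_HEFFALUMPS 7

int main(void)
{
    int heffalumps = 5;

    if (heffalumps < MAX_HEFFALUMPS)
        printf("Room for %d more heffalumps.\n", MAX_HEFFALUMPS - heffalumps);
    else
        printf("No more heffalumps, sorry.\n");

    return 0;
}

Change the 7 in that one #define line to 24 and every check in the programme follows along, which is the whole point.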
I'm sure at least someone will take serious issue with my AA Milne-based description of what the preprocessor does.
Pound defines?
Yes, he says. And pound includes. And pound ifs. And so on.
This is madness. Why wasn't I told? And can they be made to stop it?
And does anyone know why our American friends don't talk of poundtags?