Something I wrote to amuse the junior front end developers here:
document.write('Error: Script not found.');  // looks like a loader failure
var node = document.currentScript;  // the <script> tag that loaded this file
if (node.parentNode) { node.parentNode.removeChild(node); }  // remove it from the DOM
Pop that in a JS file called something like jQuery.min.js and add it to an HTML page with the usual <script src="/js/jQuery.min.js"></script>. It'll run when the page loads, add the line of text to the page, and then remove its own <script> tag so there's no reference to it in the DOM (in relatively modern browsers) if you view the source. It's easy to debug by watching the network traffic, but it caused a few scratched heads for a little while.
'View source' shows the HTML source, not the DOM, in both Firefox and Chrome. Maybe you're thinking of the developer console, or the Firefox 'View selection source' feature?
I gave a talk at BlackHat many years ago about JS malware, and proposed obfuscating malicious JS like this:
- Treat JS code like 7-bit ASCII
- For each character, convert the bits into whitespace: 1 = space, 0 = tab
- A = "1000001" = space tab tab tab tab tab space
- Concatenate it all together; a \n marks the end
So you can represent JS code as just whitespace. Which means this is malicious code:
<script>
//st4rt
//3nd
var html = document.body.innerHTML;
var start = html.indexOf("//st" + "4rt");
var end = html.indexOf("3" + "nd");
var code = html.substring(start+12, end);
eval(hydrate(code));
</script>
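(For reference, hydrate() isn't defined in the snippet above; the whitespace payload would sit invisibly between the //st4rt and //3nd markers. Purely as a sketch of the decoding step it would have to perform - the function name and test string here are made up, and it's written in C rather than the JS the talk used - the reverse of the encoding could look like this:)

#include <stdio.h>

/* Sketch only: read runs of 7 whitespace characters, treating space as 1
   and tab as 0, and rebuild the 7-bit ASCII. '\n' marks the end of the payload. */
static void dehydrate(const char *ws, char *out) {
    int bits = 0, value = 0;
    for (; *ws && *ws != '\n'; ws++) {
        if (*ws != ' ' && *ws != '\t') continue;   /* skip anything that isn't part of the payload */
        value = (value << 1) | (*ws == ' ');
        if (++bits == 7) { *out++ = (char)value; bits = 0; value = 0; }
    }
    *out = '\0';
}

int main(void) {
    char decoded[16];
    dehydrate(" \t\t\t\t\t \n", decoded);   /* space, five tabs, space = "1000001" = 'A' */
    printf("%s\n", decoded);                /* prints: A */
    return 0;
}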
I like that a lot. I wonder if it might be possible to use Unicode zero-width space and zero-width non-joiner characters. Then there wouldn't even be any whitespace to see.
I would suggest that before you leave your job, you say "it was nice working with you" to the people you liked working with, and absolutely nothing to those you did not enjoy working with.
The tech world is big in some ways, but equally small in others. A select few might find this funny, but others will not appreciate a day (or longer) spent debugging your practical joke... and if something like this actually makes it into production, it won't just be your developer buddies you've put offside.
(author of the gist...) Does anyone really think it's not a joke? This was all a few years ago, but IIRC the original thought behind it was something I saw on Twitter, or perhaps just pondering how evil the C/C++ preprocessor could possibly be.
I don't think anyone is seriously advocating that people do this to company assets. This is just stuff that would be funny in a hypothetical scenario rather than a suggestion.
And if you do make a code change, make sure it is really funny, is easy to discover and fix, and is invisible to customers. Like injecting a hilarious joke into the log files or something.
A coworker at a previous job altered the company's internal web application so that when one specific user was logged in, about 1/20 of the time it would load a hidden iframe that played Rebecca Black's Friday. This stayed in place for a long time because the user couldn't figure out what was going on and was too embarrassed to ask anyone else about it. Instead, they turned off the computer's sound and started listening to music on their phone.
I did have a friend who wrote a feature that would make ghosts flash across the screen. Long enough to see, but fast enough to make it hard to pinpoint for sure what you saw. It was locked to only happen when the "boss" used the app.
Kinda funny, no customer impact. Not sure what would have happened if he had a bug and made ghosts appear for everyone...
Right before leaving my last job, I updated an internal web-based tool to add a menu item called "Add Unicorns." It just used cornify to show unicorn images when clicked. But I also made it only show that menu item at random and somewhat rarely, so I heard that it took some people a while to notice, which was fun. To my knowledge, that change is still there!
A coworker of mine, definitely not me, left an HTML comment in a template for a page on a fairly big website that read something like "Help, I'm trapped in a website factory", so anyone perusing the generated HTML might read it. Similar to the xkcd pi joke: https://xkcd.com/10/
The horror story that I heard was of a disgruntled engineer who silently replaced the source files (C++) in the project with compiled binary object files, keeping the sources on his local computer and never checking them in. He did this over an extended period of time to make sure it crept into the backup tapes as well. No one found out because each engineer owned a code module of their own. Then he resigned.
When his successor tried to debug and enhance the code base, the core files were basically all stripped binary object files...
Some people who aren't critical to the company just have to go and try to prove the company wrong.
This is so childish and stupid it aggravates me. It only proves that the engineer was probably not a valuable asset, and he really proved the company's point with these actions. Hopefully he was on a performance plan or something similar.
At companies with poor code control practices I have maintained local git repositories of various files in the system to avoid exactly this thing, and to find issues and see when things have changed. (Too often this is because operations teams don't like to maintain their files properly, so I go out to the web servers, pull down configuration files on a daily basis, and check them in somewhere. Now I'm the one telling them when their files changed.)
Regardless, I have to assume this is before git/svn/mercurial. At least I hope it is.
Not sure how viable it would have been to check for file changes regularly, since 1) everybody had their plates full, and 2) the system was complex, with a lot of black magic that 'just worked': thousands of source files, with compiled binaries in the mix (mainly 3rd-party hardware drivers) and a lot of libraries (Qt, Boost, etc.).
A friend of mine did something similar, not that long ago.
His employer told him to complete a 1-year masters in computing at his expense, including a course on ethics, or see them in court. He chose the degree course.
That's almost benevolent on the part of the employer -- having more degrees can only be good for the person's employability, and it was most likely an enriching experience to boot. Oh well, I guess some people have larger hearts than I do! :)
Is that illegal, though? Sure, if you can PROVE it was done on purpose then maybe, but assuming you cannot, is it? Because if it is, then any misconfigured version control, or any employee who doesn't do what is expected of them, is also breaking the law.
Is it worth the risk to find out? Court ain't cheap, even if you win.
As an employee of a company that provides you a paycheck, you "owe" them your best effort. If you don't want to try, quit - but don't sabotage. That is juvenile and perhaps illegal and certainly unethical.
> Yeah, and the company "owes" you as high a salary as they can possibly afford...
When a company makes a job offer, you agree on the salary. For X dollars, you agree to be their employee and do your job. Your job is not to sabotage a project or commit binaries where people should commit source code...
Pretty sure if you had employees you would not love it if they did that.
Seems pretty straightforward to me: the intellectual property of the code this employee was developing lies with the company (by default).
Either he still has the code, in which case he's supposed to hand it over.
Or he deliberately destroyed it, which means destruction of company property. Deliberate? Yes, because a programmer claiming "oh didn't realize you wanted to keep the source codes!" is not going to fly very far in court.
(BTW I'm modelling this on my assumptions about how this would play out in a Dutch court, which can be delightfully pragmatic. So there might be some differences in how this would work in the USA, such as, as others commented, the ability to afford justice in the first place.)
Well, if the troublemaker is allowed to show the code from his own laptop, you might not see the problem.
Normally, when we do code reviews, I just ask for the repository location and branch or tag name. I check it out myself to review before we meet as a group.
I honestly believe that one of the CTOs at my previous employer has followed that guide to ensure the security of his job.
He wrote an entire custom framework for their SaaS platform. I couldn't believe my eyes when I started work on it. I think I lasted 4 weeks before I gave them my 2 weeks notice.
I believe random ones based on compilation date can be evil, since git bisect can't find them. What about an error that only happens depending on character encoding?
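A minimal sketch of the compile-date idea (the helper name and the month check are invented for illustration): because the misbehaviour depends on when the binary was built rather than on any particular commit, git bisect keeps landing on unrelated changes.

#include <string.h>

/* __DATE__ expands to the build date, e.g. "Dec 25 2023", so the behaviour
   changes with the build, not with the commit - which is what defeats git bisect. */
static int flaky_enabled(void) {
    return strncmp(__DATE__, "Dec", 3) == 0;   /* only misbehave in December builds */
}

int checked_add(int a, int b) {
    if (flaky_enabled() && a % 17 == 0)
        return a + b + 1;                      /* subtly wrong result, only sometimes */
    return a + b;
}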
PCI-compliant networks often contain checks for credit card numbers being sent in plaintext over the network. Problem with that is that credit card numbers are computed according to a formula, and it's really easy to generate a bunch of fake 16-digit numbers that will pass the check. So if you want to troll your security team, generate a CSV with a stack of credit card numbers and drop it in a few places on a server. Even better, set up a script to send it over the network somewhere. Then wait for the scan.
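The formula in question is the Luhn checksum: the final digit is chosen so the whole number passes a mod-10 check. A small sketch of generating numbers that satisfy it (random digits, not real account numbers; the function names are mine):

#include <stdio.h>
#include <stdlib.h>
#include <time.h>

/* Compute the Luhn check digit for the n digits in d[0..n-1]. */
static int luhn_check_digit(const int *d, int n) {
    int sum = 0;
    for (int i = 0; i < n; i++) {
        int v = d[n - 1 - i];
        if (i % 2 == 0) {            /* double every second digit, counting from the right */
            v *= 2;
            if (v > 9) v -= 9;
        }
        sum += v;
    }
    return (10 - sum % 10) % 10;
}

int main(void) {
    srand((unsigned)time(NULL));
    int d[16];
    for (int i = 0; i < 15; i++) d[i] = rand() % 10;   /* 15 random digits... */
    d[15] = luhn_check_digit(d, 15);                   /* ...plus a valid check digit */
    for (int i = 0; i < 16; i++) printf("%d", d[i]);
    printf("\n");
    return 0;
}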
It is always a funny joke to say "commit this when you leave a job". But I always wondered if there are people that actually do this.
Although it could be funny and give a sense of revenge for some wrong (perceived or real) that the person leaving might have suffered, I don't think this would be a good idea. Contracts usually include liability for gross negligence or wilful misconduct.
Does anybody have a record of this actually happening at any company?
The problem with doing this, besides being incredibly unprofessional, is that it won't really affect the company much - besides reducing productivity for a little while - however it will annoy the hell out of your previous colleagues. Unless you're leaving because you fell out with your colleagues, I doubt it would have the desired effect.
Yeah everyone would notice it around the same time, look at recent commits, see some silly file got committed and revert. You'd have to have a team of pretty sub-par devs to not undo this fairly quickly. Although it would be sort of annoying for a couple of hours. In fact if you left on really good terms and were still great friends with the team, something like this might be more of a silly prank than anything truly malicious.
Obviously most of those fall easily under gross negligence.
However, I threatened to use this at the company I quit in 2010, and the threat was enough. We were negotiating, and they thought a 100-0 split in favor of the employer was OK; the law was 100 percent in favor of the employee. After the threat we ended up at 50-50.
From the Computer Fraud and Abuse Act (18 U.S. Code § 1030(a)(5)(A)):
>knowingly causes the transmission of a program, information, code, or command, and as a result of such conduct, intentionally causes damage without authorization, to a protected computer
English isn't my first language, and I don't live in America so it's not like this matters to me, but now I'm curious. How is editing a source file you're meant to be editing "transmitting" anything?
Transmission is probably loosely defined as a means of getting code to the target system. It would likely include physically typing into the source file, uploading your own file through CLI or GUI, or pushing to a repository.
The CFAA intentionally used broad wording so it could be applied in as many different circumstances as possible. Here "transmitting" would be when the employee (who certainly doesn't have authorization to intentionally break things, mens rea) transmitted the file to the source control server.
The key is "without authorization". You're allowed to fix things and add features, but deliberate sabotage is arguably unauthorized. You might win in court, but as others said elsewhere, court is Too Expensive even if you win.
That, and I wouldn't be surprised if they went after you for industrial espionage or something on top of it.
Except that you're entirely authorized to access the computer system.
Purpose or intent isn't defined in that particular law.
Do you have authorized access? Well, as an employee you do.
This isn't a criminal matter, it's a civil one, and no company is going to sue a saboteur unless they need an example made; they stand to gain nothing.
A company doesn't get to retroactively redefine what "authorized access" is as it suits them.
You have authorized access, but the statute says 'cause unauthorized damage' (which in its most literal reading could be an unintentional bug, too - don't think for a moment I'm supportive of this).
I left a file called xmas.js included in an internal tool one time when I left for a 9-month trip between contracts (in November).
Basically it would check whether it was the last few weeks of December, and whether rand()%20 was zero. If so, it would wait about a minute then slowly fly a little gif of Santa & his sleigh across the background, behind all the controls on whatever form it happened to land on.
They had a team of data entry guys using this tool, and it would take on average a few minutes to enter each record. So it made its way through QA and eventually to the desk of a friend. Got an email on the beach about it. Fun times.
But also a great war story for the person who discovers it later.
"So it was my 5th sleepless night. The thing would work 99% of the time. I triple checked every single line of code and it was still formatting the hard drive from time to time. Then I discovered:
#define if(x) if ((x) && (rand() < RAND_MAX * 0.99))
A more unpleasant variant of this would be to combine `#define i j` with a language like Fortran 77 where variables are implicitly typed by letter. e.g. `#define i x` would change the implicit type from int to float...
I watched my friend swap the 'm' and 'n' keys around on the two tech directors' keyboards the evening he left, after goodbye drinks. The next day they both had to contact the IT support department, as it turns out they still look at their keyboards whilst tapping in their passwords... amazing!
I heard about a bug report once, where a guy contacted IT to say "I can't log in when standing up, only when sitting down."
IT guy is like "well that's the stupidest thing I ever heard", but he tromps up to the reporter's cubicle, and sure enough, same thing happens to him.
Eventually he discovers you can only log in if you're touch-typing, because a few letters got swapped, and nobody touch-types when they're standing up.
I think everyone knowing everyone else's salary would cause major problems, at least here in the US. It would be a major source of headache for management as they try to triage complaints from employees who are getting paid less than their colleagues with the same title or similar experience (or even less experience).
Back in the bad old days of Visual SourceSafe I believe it was possible to perform a "commit time bomb" by rolling your computer's clock forward a couple months before committing. The VSS backend would not enact the commit until the server's clock caught up to the commit's timestamp. D:
We've got a guy in the office that merges the past over the present all the time. He's not quitting but I imagine if you were to try to break things this would be a good way to do it.
Got an offshore guy just like that. He's constantly committing his whole project, where most of the files are out of date and only the few he worked on are not.
There's no reason for him to commit JavaScript files; he doesn't even work on the front end. It took me forever to figure out that he'd written over my files the other day.
Sounds to me like he may not actually understand how git works, or he's following a bizarre convention I've seen before from overseas devs I've worked with: just overwrite the entire project on every commit.
What they're doing is something like branching off a very old version of master, making changes to file A then trying to merge into the latest version of master, which already has a load of changes to file A. They then resolve the conflict by picking "ours", thus ignoring all changes on origin/master in favour of their own old version of master. This doesn't require a --force.
Sounds like one of those new git conventions that become hot every little while. Scripts get written to automate a 10-line git process, which along the way gets mangled or copy-pasted wrong, and you end up there.
DVCS systems like Git and Mercurial will require you to merge, but if you use a GUI VCS frontend it's really easy to choose the "use my local stuff" button. Been there with some university projects groups. (These people also liked the "force push" method of merging.)
Wouldn't it be wise if the compiler (or preprocessor) issued at least a warning when you redefine language keywords? :)
So I just pasted this into a C++ file I was working on and it compiled without a single warning:
#define struct union
#define if while
#define else
#define break
#define double float
#define volatile // this one is cool
I mean, redefining language keywords is not a thing I do every day, and I guess most of you don't do it either; I can't see a valid reason why you'd want to do it in a normal project.
For people who really want to do it, they'd just disable the warning.
My C++ isn't good enough to understand that first one, but if->while and break->"" can introduce infinite loops, else->"" will break logic (and possibly hit null pointers), double->float will cause subtle rounding errors in numeric computation, and volatile->"" will break multi-threaded apps unpredictably.
It's evil, subtle code breakage that, because of the macro (in an included header far, far away), leaves the code looking perfectly ordinary.
So memset was crippled from Android 1.5-2.1? How did this not bring everything to a grinding halt? (Not familiar with Android's inner workings specifically, but from a C standpoint this sounds major.)
How often is memset used to set the data to something other than zero? Probably even less often than strncpy copies a string that is exactly the length of the destination, I'd wager :)
I had a professor back in university who, while at a job, changed all the variable names to beer names. It ended up not being a problem for the company, as he was consistent with the names.
Perhaps the most enlightening (and actually useful) purpose of this file is to dramatize the glaring weakness in the C/C++ macro system. A proper macro system would not make it so easy to do this, shall we say, "evil" stuff :)
PHP actually allows much less mischief here than C or, say, Python does. It doesn't let you redefine or delete functions, classes or constants, unlike C where macros can be abused for this, and Python where you can delete anything.
But you could hide all errors if you wanted to screw with people, e.g. by calling error_reporting(0) or registering an error handler that silently swallows everything.
A thing you have to understand about PHP - at least, the php.net/Zend Engine PHP most people use - is that what is and isn't in its "standard library" isn't as well-defined as it is for, say, Python. At its most basic, PHP is just the /Zend directory of the source tree: a lexer, parser, compiler and interpreter for PHP code. It can run PHP code, but Zend alone can't do anything except maybe tell you how long a string is.
In order to actually do anything, you need functions and classes that interact with the outside world. And all of these, even the "standard" ones, are implemented as extensions. These are libraries that plug into the Zend engine and expose functions, classes and constants. You can enable them or disable them at compile-time. You can build them into the interpreter itself (static linking), or load them at runtime (dynamic linking).
PHP's source code repository, alongside the /Zend directory containing the PHP interpreter, also contains an /ext directory containing a bunch of different, useful extensions. Only a few of these are always compiled and cannot be disabled, such as /ext/standard, which includes a large number of functions dealing with things like file I/O, strings, array manipulation, password hashing, number conversion, and so on. There's also /ext/date, which manipulates dates. These are the only extensions guaranteed to be available in PHP. They're the most minimal definition of PHP's standard library.
But there's a lot of other stuff in /ext. There's JSON parsing (/ext/json), database connectivity (/ext/pdo, /ext/mysql, etc.), image drawing (/ext/gd) and arbitrary-precision arithmetic (/ext/gmp) among other things. These are all maintained by the core PHP maintainers alongside PHP versions.
However, alongside all of these, there's tons of community-maintained extensions in PECL. You can find all kinds of stuff in there, such as runkit, the extension you're talking about. Sometimes, extensions from PHP core move into PECL (usually dead ones), sometimes extensions from PECL move into PHP.
Anyway, presumably because there's no real difference between PECL and PHP-maintained extensions, both can be found in the manual. That's why runkit's there - it's not part of PHP proper, but it is on PECL. It might seem strange to group things into the manual that aren't officially maintained by PHP, but most of PHP's core extensions need separately installing anyway.
tl;dr: all the functions, classes and constants in PHP are defined by extensions, and the manual includes extensions that aren't part of PHP proper. PHP in most distributions only ships with a few of these enabled. runkit isn't part of PHP.
Basically this removes the volatile keyword from your code and replaces it with... nothing.
If a variable is declared volatile, it disables compiler optimizations and signals the compiler that this variable can be modified at any time (e.g. by hardware or other threads). Omitting volatile can lead to nasty concurrency bugs (e.g. if the optimizer optimizes spin locks away). In the worst case, such bugs are extremely hard to reproduce (and thus debug) but lead to deadlocks and/or crashes in case they do occur.
> Omitting volatile can lead to nasty concurrency bugs (e.g. if the optimizer optimizes spin locks away). In the worst case, such bugs are extremely hard to reproduce (and thus debug) but lead to deadlocks and/or crashes in case they do occur.
In C and C++, volatile is not intended and must not be used for synchronisation primitives, it is not a memory fence (so it does not force cache coherency and does not prevent operations reordering) and operations on volatile variables are not atomic. Its primary use case is memory-mapped IO (with a sub-use case of preventing eliding memory operations affected by inline assembly). If a lock is broken because `volatile` is disabled, it's probably incorrect in the first place.
All `volatile` does[0] is forbid elision of loads and stores.
[0] again in C or C++, Java and C# have completely different semantics
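To make the memory-mapped IO use case concrete, here's a minimal sketch (the register name and address are invented): without volatile, the compiler would be free to read the status register once and spin on the cached value forever.

#include <stdint.h>

/* Hypothetical memory-mapped UART status register at a made-up address. */
#define UART_STATUS (*(volatile uint32_t *)0x4000A004u)
#define UART_TX_READY 0x1u

void wait_for_tx_ready(void) {
    /* volatile forces a fresh load of the register on every iteration;
       without it this loop could be "optimised" into an infinite spin. */
    while ((UART_STATUS & UART_TX_READY) == 0) {
        /* busy-wait */
    }
}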
While you are absolutely correct, you will find no shortage of experienced employees at large companies that will disagree. In fact, they will insist that the only purpose of volatile is a primitive for synchronization. I actually had a conversation where a developer insisted that by declaring a variable volatile all operations on it were atomic.
Of course at this same company a developer insisted that if you use the mongoDB client libraries in your software, your software can never have data consistency problems.
Yeah, C#'s volatile has much broader semantics. IIRC, on top of preventing the elision of loads and stores on the volatile, it prevents reordering of memory accesses across accesses to the volatile[0] and disables thread-unsafe optimisations (e.g. thread-local caching).
That would also break the trick of declaring something `const volatile`. While that seems like a contradiction, I've heard of it being used to force the compiler to include a symbol in the final object file. In particular, I've seen it used to make a poor-man's plugin architecture.
It's not as crazy as you think. "volatile" means that something else other than us can change the value, somehow. "const" means that we aren't allowed to write to it.
So what is that? It's any kind of input. Like memory mapped GPIO.
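A small sketch of that idea (register name and address invented): the hardware may change the value at any time, so every read must hit memory, but the code itself is not allowed to write it.

#include <stdint.h>

/* Hypothetical read-only, memory-mapped GPIO input register. */
#define GPIO_INPUT (*(const volatile uint32_t *)0x40020010u)

int button_pressed(void) {
    return (GPIO_INPUT & 0x1u) != 0;   /* re-read from hardware on every call */
}

/* GPIO_INPUT = 0; */                  /* would not compile: the lvalue is const */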