Tesla.com/.gitignore (tesla.com)
457 points by nateb2022 on Nov 25, 2022 | 270 comments



So basically you run an endless script to fetch https://www.tesla.com/sites/default/settings.php and hope that some day there will be a minor nginx config error which lets you download the php source instead of executing it.

This will happen someday, so invest 5 bucks a month now and at some point you get to exploit Tesla; maybe you can be first in line for the Cybertruck :-)
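A minimal sketch of such a watcher (the interval and the "did we get source?" check are just guesses):

    # poll until the server serves the PHP as source instead of executing it
    while true; do
      if curl -s https://www.tesla.com/sites/default/settings.php | grep -q '<?php'; then
        echo "settings.php is being served as source"
        break
      fi
      sleep 3600  # once an hour is plenty for a years-long bet
    done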


This seems like too sophisticated an attack; sometimes simplicity is better: https://samcurry.net/cracking-my-windshield-and-earning-1000...


Time to try naming your tesla "drop table vehicles;"


Ah good old bobby tables :)


Great read


this was such a great read, people like you make me want to learn more and more every day


Pretty sure every site on IPv4 gets probed multiple times a day for common config leaks and other misconfigurations. Happens to all of mine.


Yeah, but if a gitignore tells you where to look, and it isn't even blocked by a WAF or any rule, it makes an interesting target, especially at one of the largest companies out there.

You shouldn't even be able to execute settings.php


It's a good sign there might be an exploitable file upload vulnerability, if you can find an endpoint that uploads files to a directory that's served by Apache with the same configuration as the directory of the executable settings.php


How is it a good sign of anything like that? File upload to disk is a completely unrelated concept that depends on how php is invoked by the web server.


Sure, I'm just saying it makes an executable file upload more likely. Because if a file like settings.php is executable by Apache, it implies that (at least in this directory) any .php file is executable by Apache, rather than a single whitelisted index.php or some wsgi setup.

So maybe the same configuration applies to a user upload directory. If you find a way to upload a .php file to a web directory on the same server, there is a possibility you can execute it - with higher success probability than if you did not know about settings.php being executable.


Finally, a compelling reason to use IPv6.


This comment transported me back to 2010 or thereabouts when this happened to Facebook. I remember being surprised at the simplicity of the code and making a lot of jokes about "build a facebook clone" ads on freelance websites.


I am sure there are lots of automated scripts doing precisely that with pretty much every company that has a website.

I used to keep a hall of shame on my main site, because looking for "settings.php" or "global.asa" on a Zope site was just silly.


Except that you'll find that error long before the cybertruck ships. Heck, you'll probably see the rebirth of NFTs and BTC over US$40000 before the cybertruck ships.


Interesting, the exclude file (actually, everything under .git/info) 403s, while .git/index is a 404.

- https://www.tesla.com/.git/info/exclude

- https://www.tesla.com/.git/index

README.txt 403s too. https://www.tesla.com/README.txt

edit: just going to add files I've found here:

- https://www.tesla.com/.editorconfig

- https://www.tesla.com/profiles/README.txt




Two space tabs, nice.


Add a trailing slash to index and it 403s


sigh


really? five down-votes because I sighed?

If you're going to down-vote me, down-vote me because I mentioned Elon is a human being, with human flaws and human strengths and not the resurrection of Supply-Side-Jesus.


A company's marketing website and their actual products have little in common. I would be surprised if any engineers even work on the marketing website, and blown away if it is co-located with something sensitive.


https://xkcd.com/932/

Ffs, a tech forum should be better than this


No, it's a valid complaint - I've seen it at several companies that the development team was eager to present a professional website (so that anyone in the know poking around would not find such embarrassing stuff and maybe be scared off as a potential new hire or customer), but it ended up in the hands of the marketing department. To the degree that the infra was moved to a different domain, so the WordPress install at "www.example.com" could never even remotely do anything with cookies at "example.net" - but yes, that might have been a tad paranoid ;)

I think the person you were replying to was not playing down what happened, but explaining exactly what the cartoon says. It not being important to the general public doesn't mean it's not a problem.


I was agreeing with the person I replied to. Most of the rest of the thread is implying this affects self-driving somehow


I would judge a vendor or consulting firm based on their marketing website. Why wouldn't I judge a car maker?


If you think .gitignore leaks too much info, you're going to love https://www.tesla.com/robots.txt


The start/stop at the bottom makes that look like it's come canned with a CMS and they've just tacked on what they needed to. It's 90% boilerplate.


It's hardly a secret that tesla.com is Drupal -- both that gitignore and the robots.txt shout it quite loudly, to be fair. One of the larger Drupal agencies, Lullabot, includes them in their client list: https://www.lullabot.com/our-work and they are looking for a sr backend Drupal engineer https://www.tesla.com/careers/search/job/sr-software-enginee... which I would take if the company were not led by Musk.


Not to mention that a lot of the subsequent requests when loading https://www.tesla.com/ contain the HTTP header+value "x-generator: Drupal 9 (https://www.drupal.org)"

So yeah, not exactly a secret.


And for the layman: https://builtwith.com/tesla.com .

Haven't seen Drupal in the wild for years. Good on them!


Probably you have; lots of websites are still using Drupal, heavily customised of course. Search for "websites made with Drupal" and prepare to have your jaw drop, as a website or two you visited recently will probably show up :)


Then you aren't visiting many US government sites, most of them are on Drupal.



You can compare it to the current version of the same file in the most recent Drupal release https://github.com/drupal/drupal/blob/9.5.x/core/MAINTAINERS...


D10 will be out soon :)


Is this a normal Drupal practice? You just deploy the Git repo?


I think, generally speaking, it’s a PHP standard practice and more broadly a scripting language practice, though it doesn’t really apply to Node.

No pre-compiling is required, so you just ship the files. Especially true for anything that offers an Apache module (like mod_php).


Ship the files, sure; ship the top-level folder, not really. Most sites will have a "public" subfolder or equivalent, so the READMEs, scripts, sources etc. don't get served. Either way, a professional would remove those files or block them at the HTTP server level.
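For illustration, the layout usually looks something like this (names are conventional, not tied to any particular framework):

    project/
    ├── .gitignore      # never served
    ├── src/            # application code
    ├── vendor/         # dependencies
    └── public/         # the only directory the web server points at
        ├── index.php
        └── assets/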


Ehhh, I don’t know if I agree that most will have anything.


> more broadly a scripting language practice,

I can tell you it's not the case with Python.


Interesting, because that and mod_python were what I was thinking of aside from PHP. What’s the workflow there like?


Well, I mean, that might have been somewhat standard practice with mod_python the last time anyone used mod_python, which I would assume was about 15 years ago.

Looks like it's been dead since 2010: https://en.wikipedia.org/wiki/Mod_python

In practice, I think modern Python webapps usually use WSGI or similar, where you wouldn't be just dumping a bunch of files somewhere.


Modern PHP apps would also be using nginx and php-fpm, but it ends up being the same - copying files over.


So the unusual thing about PHP (and, historically, mod_python, and CGI for that matter) is that it's normal to have the code actually under the web server's content directory tree. That is, if your content root is /var/www, then you put your code at /var/www/thing.php, so a deployment involves copying stuff into /var/www, and if the server is misconfigured, going to "https://example.com/thing.php" will actually show you the code.

For, say, Java or Ruby web apps, your code is more likely to live elsewhere (people love to fight over exactly _where_...), and run its own web server; nginx or apache or whatever will then proxy requests to that webserver. No matter how it's configured, you're never going to show the end-user the code, or extraneous files like .gitignore. Python's a bit of a corner-case (or at least it used to be last time I worked with Python webapps about a decade ago); it's customary to use WSGI or similar rather than a proper web server, but the effect is much the same.
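To make that concrete: with nginx in front of PHP-FPM, the difference between "execute" and "download" comes down to roughly one location block (a sketch; the socket path is an assumption):

    location ~ \.php$ {
        include fastcgi_params;
        fastcgi_param SCRIPT_FILENAME $document_root$fastcgi_script_name;
        fastcgi_pass unix:/run/php/php-fpm.sock;
    }
    # if this block is missing, or its regex fails to match,
    # nginx falls through to serving the .php file as plain static text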


That has very much not been standard practice for PHP for about 10 years now. Applications have a designated web root directory and an entry point that boots the application - as PHP is serverless by design - which is sometimes placed inside the web root by convention, but that is neither a requirement nor a security risk.

By now, stateful application servers are also powering modern PHP deployments: they listen on a socket and keep parts of the application in memory, next to an event loop.


Generally very much like Rails, with a running process that could be standalone or behind a web server interface module, depending on traffic or other requirements.


I don’t know, my servers are generally configured to serve nothing more than the index.php file and anything in the /public directory. I don’t serve up the entire content repository.


Do you deploy confidential information into the repo? That would be the root problem.


Things don't have to be confidential to be an issue. Leaking the actual maintainers' names (as opposed to the Drupal list), for instance, would not necessarily be considered confidential, but it would still be an issue if it showed up.


Usually passwords or keys are stored in a config file, and that is stored in a place outside the repo.


And the bumph at the top - crawlers run by Yahoo! and Google - lol


It’s the default drupal robots.txt it seems. https://api.drupal.org/api/drupal/robots.txt/5.x


If that's all the dirt that thousands of vengeful fired Twitter ex-employees could find, then Tesla must have excellent security.


Yeah this screams complete and utter desperation. Like, I get that hating Elon is what all the cool kids at school are doing this month but do we really need this immature garbage on the front page of HN all day?


Yep, it seems like most of the posters in this thread don’t do much software engineering, or are being purposely obtuse. There is no security vulnerability here in any of the links we’ve seen so far, minus some unnecessarily deployed boilerplate. The gitignore file is not the same file your deployment tool uses when publishing a website. If there were an API endpoint that was public, as opposed to some static asset, that would be a problem. Nothing we’ve seen here indicates that.


Honestly, all the Twitter acquisition has shown is how irrelevant to Twitter’s success the management team was. Twitter has gone from a sophisticated, large organization with 8000 employees to 700 guys following the direction of a random guy making crucial business decisions off Twitter polls (lmao), and if anything it’s become slightly more popular and successful


"more successful" how? Because Elon said so? If it's just about raw page impressions/activity, then perhaps for while since most of the western press is reporting and often directly linking to Twitter right now but what will happen once the media and their audience is bored of the drama and jumps to the next fad?

Do you really believe Twitter will become more profitable under Musk than before, when even the new CEO has already prepped the workers for a possible bankruptcy, a fat debt repayment date is coming closer, and advertisers are running away?


> if anything it’s become slightly more popular and successful

Twitter made $5bil in 2021. Do you really think this or the next quarter, post-Musk acquisition, post-him running off big name advertisers, will even approach any of the worst quarters from the last 3 or 4 years under previous management?

He has all the data. We know for certain Musk would be shouting from the rooftops if that brief burst of Twitter Blue subs made any real dent in revenue.


More than that, Twitter now has an extra $12 billion in debt, with $1b in interest payments due, which almost exceeds their quarterly revenue.


Well, I'd personally at least find some hilarity in being a Twitter engineer fired by one of those 10x Tesla engineers while they're publishing their .gitignore files via HTTPS (which probably means that their Nginx configuration is fucked).


This is not an issue and just means that their wwwroot probably comes from a repo. Anyone who judges an engineer who made this decision poorly is silly.

I’d say it’s closer to a good thing than a bad thing due to simplicity.


Not the parent you're answering, and I don't have a dog in this "elon is a god"/"elon is the devil" fight, but let's stay factual: while the .gitignore is not an issue at all, serving dot files should virtually never be done.


>I’d say it’s closer to good thing than bad thing due to simplicity.

Unless they intended to publish their .gitignore, I'd say it's closer to a bad thing than to a good thing to have random files from your repository open to the public.

The simplest S3 permission setup is to allow "*" publicly too, but simple doesn't make it better.


It's barely a vulnerability. Many open source projects have theirs public. It might be a problem if the company's system was terrible and relied on security through obscurity; but maybe they don't care. The engineers who think it's a big deal may have tunnel vision. That can happen if you spend years in a very narrow area.


It's standard practice not to serve any hidden files (names starting with a dot) over HTTP. The fact that .gitignore is served can indicate they don't block dot-paths at all, so lots of other things could slip through (.aws, for instance).


Is that a standard now? Who's going to tell the guys using .well-known?


It has always been standard, it was the #1 thing to do when setting up Apache back when Apache was the standard and nginx was still this obscure Russian porn web server.

.well-known is much more recent and an exception. Can you think of any other .file or .folder which is wise to be exposed publicly?
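For what it's worth, the usual nginx rule nowadays looks something like this (a sketch), with .well-known carved out:

    location ~ /\.(?!well-known/) {
        deny all;
    }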


I was around back then and uploading websites, (version controlling on svn, not git), and I do not recall it being a standard. The closest standard I can think of is .htaccess files (which we did upload) for various vhost specific settings.

What is your basis for this standard? Was there a mailing list agreement I missed?


That is an apacheism to avoid serving .htaccess which can include hashed passwords. It's not a general thing.


.plan


Mastodon actually uses .well-known for Webfinger stuff.


Are you sure it isn't .ht* that's blocked? That's what the default config is on my system.
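On a stock Apache 2.4 that default is, roughly:

    <Files ".ht*">
        Require all denied
    </Files>

So yes, only .ht* out of the box; blocking all dotfiles needs something broader, like a FilesMatch on "^\.".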


https://xkcd.com/932/

I look forward to meeting the Tesla engineers who work on their core tech and also their webpage.


People are just having some fun.


This looks like a default file from a Drupal installation: https://api.drupal.org/api/drupal/robots.txt/7.x


Really doesn't leak much, and robots.txt is supposed to be accessible from the internet.


Yes, it's meant to be public, but you need not disclose all of what is contained inside of it. I've been on many pentests where paths provided by robots.txt, that I wouldn't have obtained any other way, led to exploitable vulnerabilities.

For some reason, a considerable number of people don't seem to think twice about adding sensitive paths to robots.


Robots.txt is a web standard; if it lists routes to actual sensitive data, then hosting those sensitive paths is the issue, not robots.txt.

I regularly see bad pentesters fall for this.


that's defense in depth, right? /s

also, sometimes what's in robots.txt becomes invisible to the corporation as well, and obviously bugs creep in


I would rather that the paths be secure themselves. Security by obscurity is not a good idea. Anyway, there are not that many combinations of paths, even when you consider all the different CMS defaults.


You're correct that the resources themselves should be secured and that security through obscurity is a bad practice (and an oxymoron, as obscurity doesn't actually provide security).

That said, avoiding security through obscurity doesn't preclude you from giving away less information than is being given away here, nor does it make the act of removing that information entirely pointless. While this isn't the only way that the Drupal version can be identified, it is one, and there's no guarantee your adversary will find it via other avenues. Also keep in mind that with absolutely nothing changing on Tesla's end, this may go from secure to vulnerable, should, for instance, a remotely exploitable vulnerability in the running version of Drupal be discovered and published in the future.


Not the case here though, is it?


Well, we don't really know. Maybe there's some easy-to-guess text file in /misc/ that contains a password for something. We don't know what we don't know. We do know that there's considerably more information exposed here than zero - the question is whether any of that information could lead to sensitive information, not whether or not it constitutes sensitive information by itself.


How does someone on pentests not know it's the default robots.txt that comes with Drupal and hence does not leak anything except that it's Drupal?


Comparing it to Drupal's default robots.txt
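e.g. something like (assuming you have a stock Drupal checkout of the matching version at hand):

    diff <(curl -s https://www.tesla.com/robots.txt) drupal/robots.txt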


Did an inventory based on my crawler data a while back.

Relatively common to find sensitive or embarrassing links singled out in robots.txt

Especially in old large organizations, like universities.


Apparently Tesla is FOSS, see https://www.Tesla.com.


Where can I get the FSD (Fake Self Driving) source code?


edited to hide my horrific lack of HN text formatting skills


you forgot to

    from autopilot import *


What makes it fake? Just today my car drove me from my house to the grocery store with no intervention.


Is that a route that you do often and it happened to have no unpredictable events today?


Cool, meanwhile my car feels like it's an unstable toddler whenever FSD has to turn. It feels like if I don't intervene, I'll crash.

It's far from "full" self driving.


It's just random CMS bs. Nothing to hate Elon about


> If you think .gitignore leaks too much info, you're going to love https://www.tesla.com/robots.txt

I wonder if these are some of the same people that Musk brought in to refactor Twitter.


Imagine the guy at the helm here is now responsible for the most sensitive DMs of premiers and state leaders


Wow, top score for uniqueness, in the field of being stupid...


LOL, why, just wow.


I found a bug in the Tesla Model 3 reservation system that allowed anyone to get a reservation for free. Reported it via HackerOne (or maybe it was Bugcrowd, don't remember) and got told it was of no consequence and would be filtered out later or something. Got no bounty for hours of work.

I accidentally ordered my model 3 with a free reservation, not the one I actually paid for.


Given that people are selling reservations for thousands of dollars, I think you deserved something for reporting the issue. But I suppose being a hardcore engineer means never having to say you're sorry.


So, should we just add .gitignore to .gitignore and problem solved?


You're joking of course, but that likely won't do anything useful.

If it's tracked, then ignore has no effect. If it's not tracked, then you might as well use .git/info/exclude, which is pretty much the same thing but not tracked, or you can use a global excludes file; ~/.gitignore is common (you have to configure git to point at it, iirc).

It _could_ make sense to ignore the .gitignore if some other tool is parsing and using that file, but that pattern is...troublesome so I hope not.
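For reference, the two non-tracked options look like this (file names illustrative):

    # per-repo, never shared with anyone:
    echo "scratch/" >> .git/info/exclude

    # global excludes file, wired up once:
    git config --global core.excludesFile ~/.gitignore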


~/.config/git/ignore


Hm, did not know that had a default, thanks.


the classic https://news.ycombinator.com/item?id=31420268

> Git ignores .gitignore with .gitignore in .gitignore


.gitignore to Dockerignore

(Partly joking)


No. You never check out a site directly from git to begin with. And not letting other people know what files are ignored by git doesn't mean people cannot access them. :/


Nonsense.

Everyone uses git for source control, of course you check out a site with git.

All you are telling people with a .gitignore is what is _not_ available.

It means exactly that people cannot access them if your site is a checkout, because they aren't there.


Many of us have a build process that converts the contents of a checkout into a deployable site (a.k.a. "build artifact").

The build process can trivially skip .gitignore files (and all other files that are strictly for dev environments).

You then deploy the build artifact to production, with exactly the set of files which ought to be there.
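A minimal sketch of that last step (host and paths invented):

    # sync only the build output to the web root; dotfiles never leave the repo
    rsync -av --delete --exclude='.*' build/ deploy@web01:/var/www/site/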


There are cases where you don't need a build process for a site.


Sure.

In those cases a build process is usually trivially easy to put together, and has benefits, so while not necessary it's still beneficial.


Whilst that's _sometimes_ true, there are plenty of cases where there's no additional value compared to just dragging the file into your FTP client.


Nope.

Paths in .gitignore mean git ignores them. That doesn't mean the file doesn't exist. It means it's not in source control.

An example is a .env file. It may very well be _required_ in many PHP or node projects but it's going to be ignored.


It doesn't even mean it's not in source control. It just means that IF it's not in source control it won't be added to a change set automatically. gitignore has no effect on files which are already tracked, and even files which are not currently tracked can be explicitly added.
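Easy to see both behaviours in a scratch repo (file name made up):

    $ echo local-secrets.php >> .gitignore
    $ touch local-secrets.php
    $ git add local-secrets.php      # refused: path is ignored, git hints at -f
    $ git add -f local-secrets.php   # force-added; now tracked, the ignore rule is moot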


Great point!


I like the simplicity and pragmatism of using drupal. I wouldn’t work with it myself but it was probably the cheapest/fastest way to get a similar site up and running


If you stick completely within the Drupal "standard path", it's a great way to get a site up and running. Once you step outside of that path it's an absolute misery.


Dunking on a tech while using a throwaway account and not providing details on why you find it absolute misery... not very useful or trustworthy.


I spent around 3 years working with Drupal 7 and about half a year with Drupal 8.

For D7:

* The frontend and backend are too tightly coupled.

* The views system was awful to design custom, complex queries for. Documentation was scarce

* No dependency management

* Lots of weird hacks to do standard things, like the Features module

* The hooks system can result in a lot of complex, unclear logic

I've since moved on to the python/js ecosystem and it's much easier to build sites in


Thanks. Too bad you didn't find Drupal worth hanging around for. Great you found a stack that makes you happier.


Indeed. I'd use Plone, but it's overkill for a website like this.


Can someone explain why this is leaky and how it can be exploited by malicious actors?


It's leaky because it's globally accessible and provides information that isn't otherwise readily apparent.

There is no guarantee that an exposed .gitignore (or other exposed files, like .htaccess, robots.txt, etc) will be exploitable, but they aid in the discovery process and may help adversaries uncover exploitable vulnerabilities they might have otherwise missed.

At the extreme, I've seen paths of backups of the production database listed in a publicly readable .gitignore, and that database backup was publicly accessible, too.

Most of the time, nothing sensitive is revealed, but defense in depth suggests it's better not to upload files like these to your web server unless they're used by the web server (like .htaccess) or by crawlers (like robots.txt). If you do, they ought not to be publicly readable (unless intended, like robots.txt), and even then you'd want to make sure nothing sensitive is in any such publicly readable file. Even if there's nothing sensitive in them now, there's no guarantee that nothing sensitive will ever be added.


I'm gonna give my counter take. Information disclosure is something that the DevSecOps(tm) crowd spends a disproportionate amount of time on for little benefit. The number of security professionals who don't know how to code, but learned Nessus or CrowdStrike and criticize others is too damn high.

I had to work with a security team in a FAANG for several years. They were so high and mighty with their low sev vulnerabilities, but they never improved security, and refused to acknowledge recommendations from the engineers working on systems that needed to be rearchitected due to fundamental problems with networking, security boundaries, root of trust, etc. Unsurprisingly, their "automated scanner" failed to catch something an SRE would have spotted in 5 minutes, and the place got owned in a very public and humiliating way.

When I see things like this it brings back memories of that security culture. Frankly I think Infosec is deeply broken and gawking over a wild .gitignore is a perfect example of that.


I'm a professional red teamer at a FAANG company, for reference. There are plenty of times where I find several low severity vulnerabilities, none of which are exploitable alone, but which can be chained together to produce a functional exploit with real impact.

There's no guarantee any of your testers will find every issue, and there's no guarantee that a seemingly innocuous finding can't have a greater impact than might readily be apparent.

That said, there are a ton of charlatans in security exactly like you describe - folks who can't read code (let alone write it) who just know how to click "scan" on their GUI tools and export the report to a PDF. A lot of orgs have a QA-level team running those automated scans, which get passed on to a penetration testing team, who have more experience but a limited time window for testing, and then finally on to red teams, who, along with some appsec / product security folks who are embedded directly on product teams, tend to have the most expertise and the most time to really dive deeply into a service or application.

Also, keep in mind that those gawking over this probably aren't security folks, and the competent security folks here may not be gawking at the file itself (or others) - just taking part in the discussion.


> There are plenty of times where I find several low severity vulnerabilities, none of which are exploitable alone, but which can be chained together to produce a functional exploit with real impact. There's no guarantee any of your testers will find every issue, and there's no guarantee that a seemingly innocuous finding can't have a greater impact than might readily be apparent

Absolutely this. Security often has a different perspective as to how systems may be exploited together to create a systemic issue where none exists independently. Of course, this is often also where security fails in communicating exactly why these low severity issues must be corrected and facilitating an engineering discussion as to how the attack chain can be effectively disrupted and detective controls implemented elsewhere in the chain such that attempts to exploit are detected.

In short, the failure isn't in finding the threat but in dictating solutions without getting everyone involved in engineering the interaction in the room, so to speak.

It would be ideal to have security engineering as an embedded function, representing red and blue team findings as systems requirements and acting as a single point of contact for security issues, such that mutual trust and respect may be developed.


I work in .gov so I have a lot of experience with that kind of security “engineer”, but I’d take a more moderate position. This stuff is super-easy to resolve, so you should spend a couple of minutes closing it and then focus on more complex things; the reason being that when something like log4j happens, you aren’t making it so easy for attackers to know whether you’re vulnerable. Passively telling them makes it easier to avoid things like WAF blocking rules, which will block IPs that actively probe.


There's no need to minimize or explode this; we need to put it into proportion. An information leak by itself is nothing, but it must be reported and taken seriously (by default, it should be fixed).

I'm not disappointed this happens at tesla.com; I expect as much. But to many people, this is a top-notch brand. You don't expect this on google.com or nsa.gov or fbi.gov either, do you?


Personally I'd not deploy these files. Although that would be more to do with not having to discuss it yet again with auditors or pentesters than it would for actual security.


It's not an arbitrary thing, and any kind of vulnerability (including this one) is potentially a step in a chained exploit. I wouldn't be suprised if we see a hack before Tesla fixes this. And yes, they will fix it because it's a security issue.


It's a bit of an information leak, but probably not a particularly serious one. It just gives some information about what tech stack they're using, which isn't really public but also not that hard to find out, and maybe a bit about where an attacker would want to look for other sensitive stuff. Pretty minor really, on its own.

It is a bit embarrassing because most web servers (and deployment setups) shouldn't be publishing/serving dot files anyway (files with names beginning with dot). But it's not necessarily a problem as long as they have some protection to avoid the _really_ sensitive stuff leaking, it's just kind of funny.


This shows that the teams in charge of code deployment have relatively weak quality control.

In practice, it means that if the gitignore file is leaked, there is a substantial risk that they accidentally leak the .git folder someday.

The .git folder indirectly contains downloadable copies of the source code of the website, which could very likely lead to a credentials leak or compromised services.

Your life can depend on Tesla.com services.

Even if you are the pedestrian side.


What makes you think that there is some "substantial risk"? You seem to be mixing together git repos and site deployment rules. I don't see the big deal here with some CMS leftovers being deployed, but yes from a perspective of correctness this is not something that needs to be deployed.


> This shows that the teams in charge of website code deployment have relatively weak quality control.

FTFY. Little of Tesla's software is whatever they're using on the website. That'd be like judging Apple OS software by their website source.


This is the customer control panel, which directly leads to the car APIs behind it that use the same credentials.

On the same domain there is also the Tesla SSO.

It would be bad if this got compromised, as there would be direct impact in the physical world, not just a static landing page somewhere.


So basically everyone’s life is at risk because the .gitignore got leaked. That sounds reasonable.


I'd be pretty surprised if the marketing / landing site was remotely connected to the user portal. Most companies have a marketing-friendly CMS for public content, disconnected from the actual customer-facing portal.


Tesla.com seems to be more than marketing; at least customers can sign in there to do car operations.

If you can grab credentials from there you can do quite a few things already.

See https://www.teslaapi.io/authentication/oauth (and this is in the case you don't trick an employee).

But I agree that normally at some point they would catch it.


what makes you think the tesla.com website is where they keep their real code lol?


The gitignore explicitly called out where the sensitive settings file is, so presumably that makes it a lot easier to figure out where to start injecting bad code


Sure, but this appears like some very standard directories for popular website CMS platforms like Drupal.

So, not very surprising and probably doesn't really tip anyone towards anything particularly special.


It's probably caused by an incorrect nginx configuration, which means other static files may be exposed.

Otherwise, it's not much of a leak.


you could theoretically social engineer until you find something to exploit

i.e., if the file said to ignore "/site/adminpasswords.txt" then you could go to /site/adminpasswords.txt and reveal admin passwords. this is obviously a simple eli5 explanation but i hope it helps

however, i doubt the tesla.com website is where they keep any important code that relates to actual tesla software like we would see used in cars. that would be like the army having their real code for their software/systems at goarmy.com lol


It's not really leaky and can't be exploited by anyone. It's an interesting curiosity at best.


Makes me ask myself what other files are exposed.



There is a `cron.php` lol


behind auth as of 4pm ET though


> sites/*/settings*.php

Yes PHP is still relevant!


Yeah. WordPress, Drupal, Joomla, Laravel, vanilla PHP. Together they power almost 45-50% of the web. So PHP is still extremely relevant; the most relevant, you might even say.


I just don't understand why Apache dropped native PHP module support, forcing everyone to deal with finicky CGI.


Potentially also of interest is robots.txt. Who knew Tesla had ontologists?

https://www.tesla.com/robots.txt

Disallow: /taxonomy/term/*

404: https://www.tesla.com/taxonomy/term/


It looks like https://www.tesla.com/INSTALL.txt & https://www.tesla.com/README.txt exist, but aren't accessible.


Some Tesla holiday on-call devops or security person is probably getting paged over this right now.


Can’t access it…


    # Ignore configuration files that may contain sensitive information.
    sites/*/settings*.php
    
    # Ignore paths that contain user-generated content.
    sites/*/files
    sites/*/private


Archived copy for reference https://archive.ph/C6qJ4


I think a lot of people in here are overreacting a bit. This is an interesting curiosity that doesn't really have any bearing on any of Tesla's internal software.



It's just their landing page, but embarrassing nonetheless.


It’s not leaky at all.


“Hardcore Engineering”


I mean, in fairness, if you're not getting enough rest (which seems to be what "hardcore engineering" means) then maybe you're more likely to screw up the nginx config.


all the engineers that have not modified at least 50 lines in the .gitignore file in the last 60 days have been terminated


A web page isn't "hardcore engineering".


Seems they took it offline. Any mirrors?


The Venn diagram of people who had an issue with the Tesla website yesterday, and people laughing at it being Drupal today does not intersect.


One of the best technology companies (let's assume it is) cannot maintain its site with modern technology. How can I trust them?


I do not trust Tesla because of the apparent instability of its owner, not because its website does not use the most bleeding-edge web technology. The website works and I see no evidence of any security flaws. This is what matters.


Does Elon personally drive every Tesla? Or do you think Tesla engineers will follow an insane directive knowing it would put people's lives at risk and not push back or straight up walk off the job?


> do you think Tesla engineers will follow an insane directive knowing it would put people's lives at risk and not push back

Have you seen what remains of Twitter's engineers? All of them are fanboys who think Musk can do no wrong. Of course they’d do whatever the man says.

I have to imagine the same is true for Tesla.


> Or do you think Tesla engineers will follow an insane directive knowing it would put people's lives at risk and not push back or straight up walk off the job?

By this logic, it's actually impossible for a company to ever do anything wrong.


They build cars and batteries, not websites.


Cars which are primarily controlled via software and interact with hundreds of web services.


TV makers make TVs which are controlled via software and interact with hundreds of web services. But they're often, well, _not great_ at the web services bit: https://www.theregister.com/2015/02/17/samsung_smart_tv_priv...


I am glad my TV will not accelerate uncontrollably out of my room if a web request times out.


I am sure the main corporate website isn't one of those.


Getting a 403 Forbidden error.


Huh... php


Getting 403 Forbidden now


universal galactic extreme programming requires it


They've got something a bit more fucked up than just an exposed .gitignore

    $ curl -si https://www.tesla.com/ | grep generator
    x-generator: Drupal 9 (https://www.drupal.org)

    $ curl -si https://www.tesla.com/authorize.php | grep generator
    x-generator: Drupal 7 (http://drupal.org)
So they have at least two versions running at the same time. The /authorize.php [1] URI also yields a 500 (instead of a 403 like most of the other resources), which implies Apache is most likely passing the request off to PHP and the script has a fatal or unhandled error.

The webroot appears to be a Drupal 7.x installation and Apache is serving that content directly (e.g. https://www.tesla.com/MAINTAINERS.txt same as [2]) and trying to run some of it (authorize.php), while happy-path requests are being reverse-proxied to a Drupal 9.x installation.

[1] https://github.com/drupal/drupal/blob/7.x/authorize.php

[2] https://github.com/drupal/drupal/blob/7.x/MAINTAINERS.txt


"Support migration from existing Drupal 7 to the new Drupal 9 site"

https://www.tesla.com/careers/search/job/sr-software-enginee...


polite chuckling


My knee-jerk reaction is that this looks like a marketing/eng split, or even just marketing/marketing. The main "corp" website of every org I've ever worked for is managed by marketing, not by engineering, and it usually shows in the quality. Usually drives someone in engineering (like me) slightly crazy, but honestly there are a million other larger fish driving me more crazy.

IME they're almost always completely separated from the "real" systems that engineers are working on / managing. A compromise wouldn't go far, in the backend. Something like XSS would be worse.

Always seems to come from some push to "running a website isn't our 'core focus' so we should vendor that" … or something. I've also encountered immense push-back on eng-managed corp websites: all those pesky best practices get in the way of just shoveling "content" (i.e., PR) out. And so it ends up separated from eng.


FWIW, a 500 doesn't imply the server is crashing. More likely it's just throwing a generic error on unexpected input (probably because it's expecting some form/data parameters) and failing the request early. It'd be more correct to return a 400 in this case, but the /authorize.php endpoint may only be used by the tesla.com frontend, so they don't care if it's used in unexpected ways.


What's the distinction between the server crashing and the server throwing an error?


Usually, a server throwing an error would mean that it is aware there was an unexpected state, and is itself consciously not fulfilling the request by returning a 500 error, for example. It remains available to handle the next incoming request.

A server crashing implies that the server program or process itself has terminated, and is not able to handle further requests. This usually manifests as a 503 error by an upstream proxy server (nginx/apache/CDN/etc.).


They likely have layer 7 load balancing sending different paths to different servers.


Guess Elon should go and reduce some Tesla services like he did with Twitter. Having different major versions of software running must take up a lot of maintenance...


Maybe he should bring in some Twitter developers to review the code at Tesla.


I usually don't engage in silly comments but this made me belly laugh loud, ta.


I believe this is the whole point of this submission.


Drupalgeddon 7 exploit. Infinitesimal chance it’s a vulnerable version. Unless we live in a sitcom simulation


Not to defend the Twitter situation, which is foolhardy by almost any measure, but it's extremely uncommon for any company's main landing page to relate in any way to their software engineering team.

Usually these marketing sites are running a CMS (this one looks like Drupal) which is owned and operated by either an internal team who report to the CIO / IT department (vs the Product/Engineering group) or a totally external third-party marketing firm.

As long as the "real" product uses different subdomains, certificates, proper HSTS, cross-origin protection, and secure cookies (a tall order, yes, but something that would be an issue no matter what the marketing site is doing), security issues in the "marketing" site aren't as bad. Of course a marketing site takeover is still worrying, as it's a prime entry point for spearphishing and horizontal movement through social engineering, but these usually aren't the same engineers or security team at all.


Nobody (sane) is saying this is a security vulnerability or the like (especially as it seems to be a default Drupal gitignore). It's just a funny mistake from a "software first" company.


There are several people saying exactly that kind of thing in this thread, including people asking if it leads to vehicle code exploits.


I never claimed everybody on HN was sane /shrug


So what is gonna be your opinion when it gets fixed?


"It used to be a funny mistake by Tesla but now it's fixed"?

What are you expecting here?


100%. Tacking on to this, the last few places I've been the public facing site was usually managed by the marketing or design team, and often was used as a project to get them to touch _some_ code. Engineering rarely ever got involved with this if ever. If not them; probably a third-party firm.


I think this site's code repository needs to be reviewed. Maybe they should call some Twitter engineers.


Check this: https://cdn-design.tesla.com/tds-fonts/

Saved version:

TypeError: Cannot read property '0' of null

    at forceFontAssetSource (/app/routes/middleware/moduleVersion.js:89:32)
    at Layer.handle [as handle_request] (/app/node_modules/@tesla/design-system-tools/node_modules/express/lib/router/layer.js:95:5)
    at trim_prefix (/app/node_modules/@tesla/design-system-tools/node_modules/express/lib/router/index.js:317:13)
    at /app/node_modules/@tesla/design-system-tools/node_modules/express/lib/router/index.js:284:7
    at Function.process_params (/app/node_modules/@tesla/design-system-tools/node_modules/express/lib/router/index.js:335:12)
    at next (/app/node_modules/@tesla/design-system-tools/node_modules/express/lib/router/index.js:275:10)
    at cors (/app/node_modules/cors/lib/index.js:188:7)
    at /app/node_modules/cors/lib/index.js:224:17
    at originCallback (/app/node_modules/cors/lib/index.js:214:15)
    at /app/node_modules/cors/lib/index.js:219:13


https://www.tesla.com/LICENSE.txt Tesla opensource confirmed?


So Tesla is free software: https://www.tesla.com/LICENSE.txt


Maybe that means they won't pull the same shenanigans that Mercedes did :)


Drupal is.


You all are cringe. Anyone working in tech knows that most marketing sites are made by third parties, likely some WordPress shop. The hatred for Elon on this site is ridiculous.


it's very weird. someone responded once that hacker news is a bunch of smart dorks that are basically jealous because they have not achieved anywhere near as much as elon has. i think there is some truth to this.


How many of us were born owning an emerald mine after all?


Maybe do some basic fact checking? Thank you for affirming the stereotype.

https://www.snopes.com/news/2022/11/17/elon-musk-emerald-min...

> Here's a summary of our findings: We located reporting from as far back as 2009 and 2014 that said when Elon Musk ("Elon" hereafter) was a child in South Africa in the 1980s, his father ("Errol" hereafter) at some point owned "a stake in an emerald mine" near Lake Tanganyika in Zambia, not South Africa. Beyond that, we were unable to find any evidence that showed money generated from his father's involvement in the mine helped Elon build his wealth in North America.

I'll add that it's also well known that while Elon's father was certainly well off compared to the average population in South Africa, they were not fabulously rich. They were also anti-apartheid. His father's personal income was primarily from being an engineer, not a mine owner.


seriously ? go drink a Guinness.


I'm more of a Red Breast kind of person.


also a great choice!


Yes. It is indeed a strong indication that they are secretly very jealous and don't want to admit it.

Hence why they are commenting and screaming at Elon every day, letting him reign in their minds rent free; which I find quite hilarious on this site, Twitter and elsewhere.

I'm just laughing at the entire chaos being covered 24/7, and especially laughing at the angry 'dorks' attempting to escape it all; which they are unable to.


Here was me thinking we are just enjoying a real life soap opera with Musk being the nerd equivalent of Trump


Just flag bad posts. Meta-fuming only makes everything worse.


Yeah man it's wild. I think some people just need a "boogeyman" at all times to focus on, to help ignore their own faults. Elon is a bit of a bellend, but he's nowhere near the evil, cold blooded, psychopath that people on HN (and sites like The Verge) paint him out to be.


Make no mistake: this is a retribution attack in the culture wars.

This isn’t about Musk, and it’s not about Twitter-actual. It’s about Twitter-meaningful, which is to say, a way of controlling messages since 2008 or so.


My dear American friends. What if this is a psyop from tesla marketing to get your attention?


“Never attribute to malice that which can be adequately explained by stupidity”


"Always design your malice to be adequately explainable as stupidity."


Very well could be a honeypot. Though neglect and accretion are more likely.


Along with the acquisition of Twitter.


Good thing these are the people who helped fire Twitter's security team. Sure that's going to work out great.


At least https://www.tesla.com/.git/config is not accessible but still. This should never happen to a company that considers itself a software company first and a car company second.


One place I worked for exposed .git on a PHP site to the world. Infra was ho-hum about the report until they got a PoC which cloned the repo.


It’s ok. Elon code reviewed it.


this isn't reddit


It's rapidly becoming like it, though. I don't know why, but the quality of comments is decreasing, and more and more zero-value (see the comment you responded to) or negative-value comments appear. The decline has become noticeable in the last ~half year.


And it isn’t clever there.

Elon makes a bunch of moves and posts that all revolve around pruning the company. And it’s become “omg he wants code printed, he’s so stoopid!”.

If I was going to prune a company, I would do the exact same thing to immediately weed out the people that came (figuratively) empty handed. They know they’re done and it would save me the time.


Yeah, some people just think parasites and sociopaths should be at the bottom of our society, not at the top... Perfectly valid viewpoint, it just happens to be the opposite of yours.


Drupal


Does this not come as a surprise to anyone?

I'd have figured that they would have rolled out their own custom headless CMS or something really complex. I mean, not that it doesn't make sense for them to use a bog-standard CMS tool, but my biases (halo effect?) would have made me think that they use something more unique.


Drupal is pretty powerful and there is a large talent pool for it, so it can probably handle all their CMS needs just fine. And that would be smart not to roll their own.


Why over-engineer something that just doesn't bring much value to their company?


There was a posting just this week on a job site for a "Sr. Software Engineer, Backend Drupal" at Tesla. Putting together pieces like a leaked .gitignore file, job postings, etc. is social engineering in action.


That's not social engineering. You're not convincing anyone to do anything or share anything with you. This is OSINT.


The reason people think this is bad form is that it indicates the site operators did something they did not need to do. It is an artifact of carelessness at best or a misunderstanding of how their web server software works at worst. You do not need to serve a .gitignore file for a site to perform its basic function. But the obverse is also true: serving .gitignore does not detract from the function of the site.

But among people who do this kind of thing for a living, there's a belief that every action you take (like copy a .gitignore file to the directory from which static files are served) should have an intent which can be traced to a specific requirement.

It's crazy to believe some product manager sat down and put "serve up a .gitignore file" in their PRD. Some people are therefore taking the existence of the .gitignore file in Tesla's public webspace to demonstrate a lack of care when it comes to matching requirements with behaviour.

But as people have pointed out, maybe this isn't a Tesla failing as much as it is a failing of one of their providers. And sure, on the list of failures, this is pretty minor. And if you can find a web host that ties behaviour to explicit requirements, I would LOVE to hear about it. Web hosting is a low-margin business which doesn't pay premiums for detail-oriented staff. To be sure, there are some AMAZING people working for web hosting outfits, but my point is they are working at web hosting firms in spite of their technical capabilities, not because of them.

To say Tesla is a crap-fest because they left a .gitignore in their public web-space is laughable. Tesla is a crap-fest because their stock is in the toilet, they often blow past promised delivery dates (cybertruck, anyone?) and are extracting cash from the rubes who believe "full self driving" means your car will drive itself in more than the most contrived of contexts.

Elon Musk is not an idiot because you can read a .gitignore from tesla.com. Having done business w/ Mr. Musk, I can assure you he is not an idiot. But he also did not impress me as the super-genius many seem to make him out to be. He is not playing 4D chess. He's a reasonably intelligent guy who won the lottery (rich parents, an older brother who cut him in for a percentage, meeting the right people just as the USG wanted to buy more launch capability, and state and federal governments subsidizing electric cars.) If anything, he's uncanny in his ability to identify opportunity. Maybe that's even better than the Sili Valley execs whose skills extend to being white, pretty and GSB educated. (If you downvote me, please downvote me for the slight on the Haas School this last comment was intended to be.)

To recap... serving a .gitignore in your public web-space doesn't mean you're a dolt. It does mean you're probably taking less care than you could. But maybe we don't need to take such care on a static web-site. But it does make me wistful for the days when competence was more obviously exhibited.

Elon Musk is considered a jerk because of his behaviour, not because someone in one of his companies left tesla.com/.gitignore in the public web-space. Tesla is not god's gift to American industry. It is a bit of a goose up the backside of entrenched incumbents, and for that I will always have a soft spot for it. Except for the bits where they seem to be a lightning rod for controversy which always seem to be unforced errors.

Good Day To You, Sir!


John Steven has a quote I quite like: "QA is making sure your software does what it's supposed to. Security is making sure your software only does what it's supposed to."

I think this is the lens the OP wanted readers to view this post through.


TIL "obverse", ty


> Elon Musk is not an idiot because you can read a .gitignore from tesla.com.

Now imagine what Elon would tweet if he discovered the same mistake on a Twitter subdomain.


[flagged]


Maybe the other way around - his ignore file has "Tesla" in it?


[flagged]


This is not Reddit, please don’t treat it as such


I think it's okay to criticize a company on Hacker News even if the leadership is particularly popular here.


Criticizing is fine, musty tropes less so.


*musky, but ok


[flagged]


yes, but *which* verification? original flavor? scammer's friend? or simp badge?


[flagged]


Use the full quote.

Nec audiendi qui solent dicere, Vox populi, vox Dei, quum tumultuositas vulgi semper insaniae proxima sit.

And those people should not be listened to who keep saying the voice of the people is the voice of God, since the riotousness of the crowd is always very close to madness.


Huh. Is there a name for when a widely quoted sentence fragment is used for the rhetorical opposite of the full original sentence? I feel like I’ve seen this happen before, but I can’t place where.

(Also, my favourite Latin to quote at anyone who quotes Latin: quin tu istanc orationem hinc veterem atque antiquam amoves?)


I'm almost certain that TVTropes has a trope for the more general case of quote use without considering the context, but I can't find it now.

My favourite is "neither a lender nor a borrower be", which gets trotted out as sage advice. It's a quote from Polonius in Hamlet, who is depicted as being an idiot.


We seem to have bottomed-out reply depth, but, to answer if there are other examples, the commonly misquoted "the proof is in the pudding" is the opposite of the correct one: "The proof of the pudding is in the eating."


My favorite is "a few bad apples", as in "not a problem", which leaves out the second half: "ruin the bunch".

Especially when applied to police - the fact that the "boys in blue" turn a blind eye to the bad apples is what ruins the bunch. It's unintentionally accurate.


That full quote seems remarkably apposite, in the Twitter context. The fragment Musk quoted seems to mean the opposite, taken in isolation.


It's almost as if he's a world-class grifter who continually lies and whose entire net worth is predicated on keeping up an illusion of his own competence. The dude doesn't even have a physics degree, it's pure bullshit.


Agreed, what would HN be without its pretension at being above the plebs and their "humour".


I come here to get away from the Reddit crowd. The last thing I want to see is for this site to turn into... that dumpster fire. It's not about being above anyone, it's about setting a tone for a community.


I come here to read smart, creative people having discussions. And, sometimes, cracking jokes. I don't mind it at all.


They are the same picture.


not edgy enough for you?


if only someone would offer to take it private for 3x its value!


Time for a TeslaDAO. The cryptonomics are in the favour of us plebs, and I reckon we'd meme up 100x billions in no time.


I wonder whether he’s just taking revenge upon the employees for mocking him buying it for 3x its revenue. “SEC wants me to buy it whole because I have 9%? Employees and board are happy to force my hand? Ok.”

Being suicidal and ready to lose everything to make a point is probably another facet of the same character trait.


you forgot 1 figure: 30x


actually... maybe 2 figures: 300x


Hilariously, most people are unable to program in a secure fashion. https://www.zdnet.com/article/over-100000-github-repos-have-...

News about Tesla's security seems vaguely wanting. I do not know what this .gitignore file is about, but it is alarming enough to draw conclusions from.


Edit: failed to provide context, most people should be left out of writing public interfaces for non-hard real time systems.



