I use it regularly. Sometimes it’s broken, and maybe nobody notices but me? :)
Their natural language queries for things that I know they know about are amazing. Here are some that I have used recently. You really need to see these results to appreciate them.
the input language is less flexible than wolframalpha/google, but i quickly got used to it. it's nice to have something local and reliable. you can also define custom units (see the sketch after the examples below).
i prefer using it in terse mode:
$ units -t 0.03$/hr*1month
21.914532 US$
$ units -t 10TB/month Mbps
30.421214
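e.g. custom definitions can go in ~/.units, which GNU units reads if it exists (a sketch; rackunit/serverrack are names invented here for illustration):
$ cat ~/.units
# personal definitions; GNU units loads this file automatically
rackunit    1.75 in
serverrack  42 rackunit
$ units -t '2 serverrack' m
3.7338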
qalc[1] is also quite nice if you're looking for a command line calculator; it handles units well, has some other fancy features, and has a very lax parser, which i find to be a huge plus.
While APL dialects are very nice for this sort of thing, they generally don't understand units of measure or know about physical constants; you have to put those into them yourself. Here are some of my recent units(1) queries:
141 pounds force 30 mm # in joules
1160/4
log(3)/3/(log(2)/2) # how much more efficient is one-hot ternary than one-hot binary?
5V 7 μs / 7.3 A
.0117% half avogadro mol / 1.251e9 years / (potassium+chlorine)g # how radioactive is lite salt?
3.27$/gallon # in $/liter
sqrt(2 2000 electronvolt/electronmass)
18.8 foot pounds force # in joules
163$/(7.9 g/cc * 1500 mm 3000 mm 3.2 mm) # cold rolled steel price is higher than steel sold by weight
m3/4 / 15 cfh
2 pi sqrt(200 um / gravity)
Same here, the way it seamlessly wrangles even the most ridiculous combinations of units is insanely useful. Just yesterday I used it to calculate power consumption for a house by timing one of those spinning wheel meter things. Something like "(10 rot / 46 s) / (375 rot / kW*h)" and it gave me a straight answer in watts.
I definitely could've worked that out by hand, but it would've taken a minute or a few, mostly on unit conversions. With WA, I can just think in variable relationships and not worry about units at all.
Don't get me wrong, it often returns complete garbage, see all the memes of Siri passing non-math questions to it. It's annoying to figure out or explain to someone because the syntax is very loose and you just kind of need to get a feel for it, but once you do, it's really powerful.
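For what it's worth, the same meter calculation also works offline in GNU units(1). A minimal sketch, with the dimensionless rotation counts dropped since they cancel anyway:

$ units -t '(10 / 46 s) / (375 / kW h)' W
2086.9565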
Google handles GMT and UTC but doesn't handle offsets from there and, frankly, it's understandable and I wouldn't bother either. What it does handle though is countries and their DST settings:
A regular complaint I have with google is (simplified example) converting EDT to MST. Google will “helpfully” correct me and convert EDT to MDT instead, which is explicitly not what I asked for. It’s stupid (I can usually figure it out on my own) but that would be a huge win for me.
Google is also often wrong. For example, my computer is set to US English, as is my profile. Yet somehow it still gets confused on decimals and commas (they are switched in my current country’s locale).
Also, if you use the Firefox search bar only the first 20 chars are sent to google, so longer calculations are truncated before they're calculated and wrong answers come back with no warning. Not a Google problem per se but a risk of using Google to calculate still.
or frink [1], which started off as a tool like the others mentioned here, but is now a full-fledged units-based programming language. See some examples here [2].
The sandwich example was brilliant! I never expected that to be possible. (The example of packing smaller circles in a larger one, in another comment, is also brilliant, but less useful for me today, I think.)
> 1 egg, two slices whole wheat bread, one slice of cheddar, two.. leaves of lettuce ..
and he said it's wrong and useless (!), giving me examples and numbers such as:
protein assimilability from bread is 40%, etc.
Is there a way to get correct answers from Wolfram regarding this ?
(a query phrased as "assimilability of ..." doesn't work)
Edit: Excuse me, what's wrong with you downvoters - it's a legit question. Or is there something wrong with assimilability?
Are you happy being off with your answers by 60% - or jealous that a human can have better answers?
Wolfram isn't reporting how much protein you'll get from eating something; it's reporting how much there is in the bread. Protein assimilation depends on a huge range of factors, and varies significantly between individuals (based on everything from gut microbiome to health factors to how much you chew your food to your saliva production to... Well, it's a long list). There's no way a website could report the amount of protein you will get from bread. Reporting how much is in the bread makes much more sense. It's a shame your friend didn't explain that.
This is something that actually annoys me immensely when people say "you eat too much!" to fat people. Two people can have the exact same diet and the exact same exercise regime, and if one assimilates particular foods more effectively they'll be getting more calories, and put on weight. Food intake is far more complex than many people believe.
>This is something that actually annoys me immensely when people say "you eat too much!" to fat people. Two people can have the exact same diet and the exact same exercise regime, and if one assimilates particular foods more effectively they'll be getting more calories, and put on weight. Food intake is far more complex than many people believe.
I don't see why that statement is inaccurate. It's not "you eat more than me" but "you eat too much." As in you eat too much versus how much your body is able to burn of the calories it assimilates.
The problem is that it is usually presented as a "simple" solution. "Just eat less. Reduce your food intake until you're at a calorie deficit". For some people, that can mean eating three small, but satisfying meals a day. For others, it can mean eating extremely strict rations for only two meals a day, leaving the person constantly hungry and cranky. Then it becomes a will power issue, which as we all know is a function of brain energy reserves (right, we all know that, right?!). Throw in a mentally challenging job versus just phoning it in and it's really not actionable advice.
I agree that a dramatic change is very difficult and the level of difficulty varies from person to person. However, obesity is probably one of the worst long term health predictors. If it leads to diabetes, almost all outcomes get much worse. The change is worth the difficulty.
For me, I quantified what I was eating and simply reduced it a bit by careful tracking. I also did quite a bit of relatively low heart rate exercise and did do some shift of the calories away from carbs. I also identified some intake that was purely habit and not sustaining, like late evening snacks, and eliminated or modified those. Lost 35 pounds in a few months. It may take a while, but the math works over time. It is relatively simple, but it is not easy. I kind of turned it into a game and that helped a bit. At any rate, I wish anyone who decides to try the best of luck.
It's a willpower issue for 3 days, the time it takes for your stomach and appetite to readjust to a lower volume of food intake. Anyone who's fasted knows how easy skipping meals is--it's certainly not the agonizing test of willpower you and many nonfasters seem to think it is.
And by the way, if diet and exercise are not the path to weight loss, then what is?
You also don't need to eat that much less if you're at a stable weight. 10% less a day means you lose a pound every 1-2 weeks (10% of a ~2,000-3,500 kcal/day maintenance intake is 200-350 kcal/day, and a pound of fat is roughly 3,500 kcal). In my experience people seem to not like it when you tell them, after they ask, that you lost weight simply by eating a bit less every day consistently for a year.
Trying currently to lose weight: the reason why I don't 'like' this answer is because I don't track my food intake closely enough to be able to know what removing 10% means.
So I guess the first step is to write down everything you eat in a way you can monitor, to be able to reduce it by a small amount if necessary.
Don't do a bunch of tracking: it's too much effort and you'll have a hard time sticking with it. Try 16:8 fasting (you can only eat within an 8 hour period each day). I also recommend reading this post to understand how the body works in terms of weight loss: https://karpathy.github.io/2020/06/11/biohacking-lite/
You don't really need to track your intake perpetually as I see it. But measure your weight weekly, at the same time and day, to better account for water/food/etc.
If your weight is not going down then try to eat somewhat less. Maybe skip a side or order a salad instead of fries or get 1% milk with your coffee. Or cut a potato from your dinner if you're cooking.
That said, tracking for a while is good to figure out what you can cut since you may not realize how much you eat (snacks, night snacks, soda, etc.).
In my case I stopped eating those free chips at work and stopped drinking a can of coke with lunch. I also tried to avoid large dinners but just large enough ones that I wouldn't go to sleep feeling hungry.
Diet and exercise are indeed not the path to weight loss. This is well known: most fad diets work this way in some fashion or another, and it’s well known that most fad diets fail.
Since part of this touches my field, physics (“a calorie is a calorie” is an attempt at a thermodynamic statement), I feel somewhat qualified to talk about why people have this misapprehension, even though I am not an endocrinologist or a nutritionist; they would have better answers for you in many other respects.
Thermodynamics is necessary but not sufficient to understand the problem. There are many physical problems with ending the explanation there.
The first is that it ignores equilibrium. The claim is that I can diet and exercise down to the weight that I want and then return to the lifestyle that I had before but maintain this new weight. That is, when you say diet and exercise you are talking about temporary interventions, and no temporary intervention is going to permanently disrupt the equilibrium. Put another way, most people calculate a basal metabolic rate or total daily energy expenditure at their present weight, and leave it at that. If you're a physicist, you want to calculate it at two different weights and look at the slope between the two, which has units of kJ/s/kg; but a kg of fat also maps to a certain number of kJ, so this slope is actually the inverse of a time constant of something like a year. Some crude differential equations then suggest that this time constant acts like the half-life of your excess weight: if you start living like someone who is 50 lb lighter than you, after a decent chunk of a year you will be 25 lb lighter, then 37.5 lb lighter after another... Basically, we regress to a weight set by lifestyle. So the focus on an intervention is wrong. Instead one needs to focus on a whole lifestyle shift. You need to focus on setting a new equilibrium, not on burning calories.
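A back-of-the-envelope version of that time constant, using rough textbook numbers that are my assumptions here (about 7700 kcal stored per kg of fat; daily expenditure rising by about 30 kcal/day per kg of body weight):

$ units -t '(7700 kcal/kg) / (30 kcal / kg day)' days
256.66667

About 260 days, i.e. the "decent chunk of a year" above.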
But this is a really crude model and that gets into the second point, which is that you are assuming that the system is linear, like an electronic circuit made only out of inductors and capacitors and resistors. The problem is, it is not; it is in fact a complex system of feedback loops braided together. A picture is worth a thousand words here.
Once you have feedback loops, there is no guarantee that changing the input voltage to an electronic circuit by 10% will reduce some voltage observed inside the system by 10%. It might, it might not. Changing a complex system requires a fundamentally different approach. Often to change one output, the entire system needs to be reconfigured.
As a direct consequence of this, it turns out that most people who go on diet plans hit “the wall.” At the wall, the feedback loops in your body are downregulating your basal metabolism and your perception of available energy. They are jacking up hormones that make you hungry, and also inducing you to wear more sweaters and other such things. They impel you to have “cheat days.” Part of the cause of this may be that your body does not know how to burn just fat. If your body runs out of energy it starts burning everything, both fat and muscle, to make that energy. As a result, if you don't target your exercise and diet to build muscle, losing weight quickly can actually drop your lean muscle mass, and your body is reacting to this global damage by telling you that you're sick, because you are. At least, that's one explanation I have seen; I am not a doctor and do not have any qualifications in this way. For all I know, maybe the body is using your fat to try to sequester some sort of toxin or pollutant from the environment, and suddenly dropping the weight releases all of this crap into your blood, and that's the reason that your body suddenly wants to put on weight again. Don't ask me these questions.
These sorts of feedback loops are why I would recommend listening to endocrinologists; the endocrine system is a signaling system in the body, so these people are very keenly aware of all of these feedback loops and how they reinforce each other. In his recent Metabolical, Dr. Lustig, a research endocrinologist, suggests that focusing on weight for health outcomes is actually totally backwards anyway: that there are more thin sick people than fat sick people in terms of absolute number, and that sickness should come first, and weight is probably just a symptom that some people don't express. He gives some better advice about the benefits of healthy eating—studies where they kept calorie consumption and weight the same, and demonstrated huge improvements in health markers, simply by switching out sugary kid food for starchy kid food. Stuff like that.
The insight from complex systems is that telling people to focus on diet and exercise is deeply blaming and that blame might drive shame spirals that are causing the problem in the first place, which is again where I have to step back and hand the problem over to psychologists this time. Viewed this way the problem is that you have an unhealthy relationship with food, and it is unlikely that telling you to diet and exercise is going to magically make it a healthy relationship with food. Mindfulness exercises while eating could for example be a better option. Telling people to eat when they are hungry, but they have to put it on a plate and sit in a dining room and put away their phone and enjoy the food with gusto and stop when they are full: this might help with these binges.
> Diet and exercise are indeed not the path to weight loss. This is well known: most fad diets work this way in some fashion or another, and it’s well known that most fad diets fail.
I think you're misunderstanding what I'm saying. I'm not suggesting people go on keto or weight watchers. Those fad diets don't necessarily fail because they're ineffective, although they probably are--they fail because they are highly prescriptive and restrictive and it's difficult for people to actually execute the diet.
What I'm saying is that reducing total food intake for 3 days creates a lasting decrease in appetite. You can prove this to yourself by skipping breakfast for a few days: after a while, you will simply not be hungry at breakfast time.
I skipped breakfast (and lunch) for two years. I learned that I could push back against the hunger pain, but I was hungry, and my appetite did not decrease (I was hoping it would).
I mean, I could say the same thing about quitting cigarettes. I absolutely realize it’s hard to do, and that’s why so many people still smoke. But my advice would be the same...
The problem here is that you think you're giving people advice when really you're just telling them what to do. The difference is that advice comes with kindness, compassion, an understanding of how the advice is affected by someone's situation and context, and deep knowledge of the subject you're advising about.
Equating changes to diet for weight reduction to quitting cigarettes shows you probably don't have that.
... and that is also extremely bad advice. There is very wide variability in how nicotine addiction affects different people. Some people can quit after a pack-a-day habit and have no problems. Some people have trouble with getting off a pack-a-week habit.
Someone who is having extreme difficulty quitting smoking could benefit from working with a doctor to discuss quit-smoking aids or even seeing a therapist to work through their addiction.
No shit, the person needs to "just stop". Way to point out the obvious. Most people don't have a "just stop" button.
IDK, maybe you're just bad at giving advice. Maybe you should just stop.
EDIT: this is seriously an article on The Onion in the making. "Nation wakes up to random forum poster telling them to 'just eat less'. Obesity epidemic ends overnight." The proof is in the pudding here. Telling people "just eat less" is shitty advice.
People need to develop agency in actually doing something to lose weight, which essentially comes down to eating less. It might be painful in the short term but is a huge benefit in the long term.
All I've found online is people giving excuses as to why one body type cannot do this or that, which are essentially the same reasons smokers give when trying to quit (too stressed, can't quit cold turkey, etc.).
> Someone who is having extreme difficulty quitting smoking could benefit from working with a doctor to discuss quit-smoking aids or even seeing a therapist to work through their addiction.
Oh—absolutely! As I said, it’s hard, and frequently requires professional help, strategies, etc.
But, it ultimately comes down to, you have to find a way to quit! You shouldn’t let yourself off the hook.
Is that true? I was actually under the impression that people's appetites do eventually adjust (especially if you reduce your intake slowly), although it can take years.
Except it is, as I see it. The goal is to eat less, and to achieve that you need to figure out what you can eat less of while still feeling fed. Sure, it doesn't apply to everyone, but nothing does.
For example, for me, 600 calories worth of chips will keep me feeling fed for an hour or two. 600 calories worth of pure brisket can keep me feeling fed for 8 hours. You can guess which I tend to eat more of when I'm trying to lose weight.
edit: Also if you're at a stable weight then we're talking 10% less food per day and not 50% less.
For that to be true, the basal metabolic rate would need to be vastly different between individuals. While it is true that there is significant variation (according to Wikipedia, more than 100%), most of that variation (60%) seems to be explained by differences in lean body mass, which is the other side of losing weight: exercise. From those results I would argue there is little evidence that some people would have to cut their diet to almost nothing while others could continue eating almost as before.
This is wrong; the digestibility of gluten is 80-90%. Your friend was probably thinking of the PDCAAS, which is more like 45 for gluten. But this is nutritional quality vs an egg white equivalent as defined by the bioavailability and concentration of essential amino acids (egg = 100 by definition; the score is based on the lowest fraction of any EAA, so gelatin — no tryptophan — has PDCAAS 0), not the fraction absorbed or utilized. For an idea of what utilization looks like see e.g.:
> The net protein utilization is profoundly affected by the limiting amino acid content—the essential amino acid found in the smallest quantity in the foodstuff. It is therefore a good idea to mix foodstuffs that have different weaknesses in their essential amino acid distributions.
> The limiting amino acid for wheat is lysine.
From what I gather, you still can process all of the protein from wheat if you get lysine from somewhere else:
> A vegetarian or low animal protein diet can be adequate for protein, including lysine, if it includes both cereal grains and legumes.
This also means that any statements about protein utilization from compound meals are more-or-less bogus if done without calculating the different amino acids.
You can improve the protein assimilability of bread by combining it with a high-lysine protein (and I'm not sure but I think eggs and cheddar might fit the bill) but you may not care if you're looking for low-glycemic-index low-fat calories rather than amino acids specifically.
It seems that Wolfram Alpha also has some difficulty figuring out whether I'm talking about raw oats or cooked oats, even when I use the word raw in my query. As a result, it can be off by a factor of 3. I agree that it's not useful if you have to carefully check the output every time.
> You really need to see these results to appreciate them.
Seems more like the quality of the queries rather than the results. Many of the complaints I see about Google and friends are related to them dumbing down search for the global common denominator.
Any advice on rephrasing it to work would be welcome. Downside to allegedly natural language query systems: there's no concise explanation of the syntax they recognise.
1. It's slow, even for simple microsecond computations like log(2). Takes about 5-20 seconds to load a page on my 1Gb fiber connection. Opening Python/SymPy Gamma is much faster for most things (see the one-liner after this list). https://gamma.sympy.org/input/?i=log%282%29
2. Every time I use it, a box saying
NEW: Use textbook math notation to enter your math. TRY IT
pops up over the result, and clicking the X doesn't hide it the next time I search. This adds ~3 seconds to the result time.
3. I'm a long-term Mathematica user, but typing literal Mathematica syntax usually doesn't work, except for simple expressions.
4. Results are PNGs, and copy-pasting a numerical result takes a few unnecessary clicks ("Plain Text" > Copy).
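For comparison, the log(2) case is effectively instant locally (a sketch; assumes Python 3 with SymPy installed):

$ python3 -c 'import sympy; print(sympy.N(sympy.log(2), 20))'
0.69314718055994530942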
> Takes about 5-20 seconds to load a page on my 1Gb fiber connection
Wolfram Alpha is implemented in Mathematica, which --- to understate the situation --- was never intended as a high performance backend server language. I suspect that's the reason for the bad performance.
"As a result, the five million lines of Mathematica code that make up Wolfram|Alpha are equivalent to many tens of millions of lines of code in a lower-level language like C, Java, or Python." [1]
Sure, there's something to be said for implementing logic in high-level code, but without a plan for lowering that high-level logic to machine code in a way that performs well, you're setting yourself up for long-term pain.
I doubt the bad performance is due to evaluating expressions itself. If I type N[Log[2]] into Mathematica, it evaluates in less than a millisecond.
It's probably because Wolfram Alpha is using natural language processing to try to parse my query before finally deciding that by N[Log[2]] I mean N[Log[2]]. Or maybe it's not even that, but their grid scheduler not being optimized for sub-second latency.
> Sure, there's something to be said for implementing logic in high-level code, but without a plan for lowering that high-level logic to machine code in a way that performs well, you're setting yourself up for long-term pain.
Whatever the reason for the performance issue (I don't know enough about WA to speculate what/why/how), I feel like noting the existence of the wolfram compiler[0] and the various language interfaces[1]. Anyone interested in using Mathematica/WL might get a kick out of exploring those more, at the very least.
Mathematica is extremely performant for most of the built-ins; the overhead of interpretation is nearly negligible for all but the tiniest operations.
There is also no reason to think that their request-response boilerplate is written in Mathematica; Mathematica is fully integrated with a lot of languages and runtimes.
> Opening Python/SymPy Gamma is much faster for most things.
Is there a way to make it plot multivariate functions? I tried but whenever I enter two variables it says "Cannot plot multivariate function." I've seen many Python packages plotting multivariate functions so I'm convinced it should be possible.
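For what it's worth, local SymPy can plot multivariate functions even where the Gamma web UI refuses. A sketch (assumes sympy and matplotlib are installed):

$ python3 -c 'from sympy.abc import x, y; from sympy.plotting import plot3d; plot3d(x**2 + y**2)'  # opens a matplotlib 3D surface over the default -10..10 range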
I usually use Python for math stuff too; however, I think log(2) is maybe the wrong example. I basically got an instant result for that (just recorded this): https://imgur.com/a/g5slHsR
Your Internet bandwidth is not relevant when talking about a compute-heavy backend like this. Wolfram|Alpha is not going to load any faster on a 1Gbps connection than it will on a 20Mbps connection, other than some static assets, but even that isn't going to be hugely noticeable if we're talking about 2ms RTT on fibre vs 8-20ms RTT on cable/DSL. If you're downloading a giant file off a nearby CDN, then sure, 1Gbps fibre is useful. I can max out my 1400Mbps cable connection downloading things this way (it's mind-blowing...), and my latency to my upstream gateway outside of my house is 8ms. But Wolfram|Alpha isn't going to load 40% faster for me than it will for you since it's I/O bound and your end-to-end latency is waiting for the backend to complete your request.
I will say, though, that Wolfram|Alpha could be "optimised" in the sense that it could do less fancy JS and be a simple box with a submit button, like SymPy Gamma.
I used it for Calc 1 and 2. It helped me check my work for limits, derivatives, integrals, Riemann sums, series, and sequences. I love the part that says "Show Step By Step", because I can figure out in which step I made an error.
The answers in the back of the book didn't tell me step-by-step how to solve the problem. It just gave the answer, and there were many times I couldn't figure out in which step I made the error. Usually it was some dumb mistake, but by identifying the dumb mistake, I could remember to double-check that similar step in future problems.
I had a hard time using it for Classical Physics to check my work.
Having someone, or a program, show you where you went wrong is a good way to learn nothing. All the learning comes from struggling when you almost have the answer.
I think the strategy of Wolfram Research has shifted from trying to sell Wolfram Alpha as a standalone service, to selling the Wolfram Language with WA functions for retrieving standard datasets. A finance professional, for example, probably did not gain much information from asking WA "would it be better to invest $100 in GOOG or FB in 2013?", but the `FinancialData` function for pulling end-of-day stock prices enabled these people to do interesting analysis that they couldn't have done otherwise.
(source: conjecture, but I did work at WR for 3 years and on the initial Wolfram|Alpha release)
Overall very positive, Stephen is a brilliant visionary and the software (Mathematica back then) was the best thing for someone early in their career to work on. Some of the ideas I picked up around symbolic computing and functional programming were quite helpful later on, and the whole experience opened some doors that wouldn't have otherwise. It's almost been a decade, so I unfortunately don't have much insight into sentiment these days.
For most of these, Wolfram|Alpha seems to give reasonable results. However, for the third one, because I'm in Argentina, it helpfully converts US$79.80 into Argentine pesos, getting an answer that's off by about a factor of 2: AR$7969.44. As https://preciodolarblue.com.ar/ explains, the current bid and ask prices for the dollar are AR$195 and AR$199. Wolfram|Alpha is apparently using the "official" rate of AR$99.45 or so; this is the rate at which the government converts your dollars into pesos if you are an exporter, but you cannot convert your pesos into dollars at this rate without special permission, granted, for example, if you are going on vacation to Disney World.
I absolutely hate this. Scanning a video is so much more difficult than just scanning a written explanation. If these videos are being monetized, I think that's a problem. If they are, someone could just create a channel by converting SO questions into videos.
The question I have is: what kind of keywords are people using on these videos that Google feels are more worthwhile than the actual text of a written version of the content? Or is the algo just so heavily weighted to pick a YouTube link?
Having worked a stint in social media for 4 years, I saw this huge push from Facebook for publishers to churn out videos. I suppose Google's ranking algorithm favors YouTube, but I don't get Google's reasoning behind that. Engagement because of embedded ads?
Sometimes even this doesn't work. I used verbatim search and got back results which didn't contain the word I looked for.
I then just sadly wonder how the heck this could be possible and resignedly slowly shake my head.
I could wish for a feature where I double-double-quote the word to emphatically indicate that this word must exist in the result and not be left out under any circumstances. But then again I am sure that the search quality will continue to decline, and even double-double, triple-quote, quadruple-quote words &c won't help anymore. Sort of a quote inflation.
This is correct. In this query, the '*' is being disregarded. Then, I assume, more people on the internet discuss 48 and 6 in the context of long division than in the context of multiplication.
It is. Try searching for 16*9: for good reason, it shows both the calculator and links to 16:9 and 16x9 aspect ratio content.
It's reasonable to think that the calculator already answered the question, and I'm not looking for pages on the simple multiplication once I've already seen the answer.
Imagine the uproar if those results didn't come up because a bunch of children's math quizzes were found instead.
What we'll discover is there is a team dedicated to determining when to display the calculator. Then there will be another team entirely that picks how to interpret the query for website results. The two teams will never have met, spoken, exchanged information between the two. The team searching websites will mysteriously have never thought that someone might search a webpage for a math equation.
As a non-native speaker I would welcome more "grammar nazis" in places where well educated native speakers can be found.
One of the reasons children learn new languages quite rapidly is because they get corrected the whole time.
Not correcting people actually hinders their progress in language learning… Even if it might seem impolite, it's the one thing that helps a lot, if not the most, in mastering a foreign language!
So thanks for being a "grammar nazi". We need people like you.
(No, that doesn't apply to the casual typo. But I guess most people can differentiate such a thing from true grammar and spelling mistakes; especially if they are "typical" mistakes.)
No it isn't. "Lede" is a neologism arising from people convincing themselves they had inside information. It's been "burying the lead" as long as the phrase has existed. Your own link explains that.
The link says it can still be "lead". "Lede" seemingly came about for random reasons. So I learned the opposite today: that I can still write "lead" instead of "lede".
For me, I never got into using it much (due to lack of experience with Mathematica syntax). I had some niche uses like "how many work days between <date1> and <date2>" but that's hardly so important.
Sympy live shell is decent, and the latex rendering is pretty sweet. But, it's on ancient versions of everything, runs slowly, and has a C- UI.
Instead, I use Colab with Sympy + latex output and matplotlib (and most other things you could want to import, pre-installed). It's running new versions of things, and backed by more power, with an option to pay for even more. The latex rendering took a bit of poking around stackoverflow, but works just fine.
What do you mean? I used it to solve a nasty impedance network for the real and imaginary components yesterday and the solutions were accurate.
Edit: Maybe it's just good enough that people treat it as a tool and see no need to market it. It consistently has worked fine-ish for years and is useful at what it does.
My meaning was just that I saw it sometimes referenced on HN, but I haven't seen it mentioned for a while now. Hence my search and results showing 8 years since.
I guess what I should be doing is looking at the Alexa ranking of Wolfram Alpha.
I appreciate the conversation around WA this Ask HN has started, but yeah you've basically completely answered the original question by pointing this out.
They're just serving up answers which is boring to HN readers. Where's the drama in collecting data privately? Where's the drama from censoring results? No drama == No interest? Gawd, I have become cynical.
Edit: Sorry, I don't know how to make the search query text show up since it has special characters, probably best to just use the links to see the query.
I'm unable to try to compute something similar by indicating the quantity and the percentage of alcohol, such as :
"2 beers (composition of 8% alcohol, 44cl) in 1 hour at 80kg"
I tried with or without parenthesis and with varying query. Never worked.
Any ideas?
(I'm interested in knowing the blood alcohol level and how long it takes to go back under the limit, depending on the percentage of alcohol and the quantity)
And it can't ever be accurate. Thing is, how drunk you're going to get within some time frame depends on what (and how much) you're eating, and whether your stomach had some food in it before you started drinking. If there's stuff in your stomach, its sphincter is closed while it's digesting it. Stomach itself absorbs alcohol (and nutrients) much slower than the intestine.
TL;DR: don't rely on this calculator to determine if you're too hammered to drive. If there's any doubt whatsoever, call an Uber or use public transportation.
This is something people from countries where people drink primarily hard liquor (Russia, Finland, much of Eastern Europe) know pretty well even if they don't know how to explain it. Another trick is to watch _what_ you eat. Meat and fat stay in the stomach for much longer, so if you focus on that, alcohol won't hit your liver all at once.
A more important question: what happened to Wolfram? I think they missed an opportunity to have an enormous market by pricing themselves into a niche. They had so much cool stuff that could have played a much larger role in most developers lives. And which would have funneled more users into higher end premium products.
Every now and then I go to their site to have a look -- and then realize that I'm not going to go subscribe to some piece of software I'm unsure I will be using enough to justify the cost.
Wolfram himself is working on physics and fulfilling a lifelong dream. (He was just on Lex’s podcast.) Say what you will about his contributions, but it is hard to argue he hasn’t been enormously successful at achieving his goals of developing an entire cathedral of work he can use for his own intellectual pursuits.
they have plenty of customers and are always hiring so I don’t think there’s much pressure to change their business model… they have a 15 day trial and IMO the ~$200/yr I pay for a dual boot personal license is worth it for the documentation alone, AMA
So the price is right on the limit for something I'd like to play around a bit with at home and "maybe it sticks". This is something I'd like to learn for the programming environment. But I'm not going to write software based on this that actually does something. I don't want to produce software that has a $200/year dependency. So then the amount of time I can invest goes down sharply.
I mean, good for them that they're doing well. They probably don't need the money. But their technology is highly unlikely to ever be part of any software I write. And it isn't because it is a bad fit. I do lots of stuff that would benefit greatly from having Mathematica plugged into it.
Thanks for the tip on the book. I might actually buy the paper edition and read that.
I used to use it extensively during my early PhD work for back-of-the-envelope calculations. Unfortunately it became steadily harder to enter queries and have them understood. About half a decade ago they broke about 70% of what I used it for by refusing to show results for modestly complex calculations and instead throwing up nag messages for the paid version. The paid version, last I saw, was not available through an institutional license.
Last time I tried to use the retrieval features for nuclear data there was absolutely no citation info or documentation whatsoever, just numbers from who knows where. WA had so much potential but peaked about 3 years after it came out, as far as I can tell. That being said, it's still vastly superior to doing calculations with Google.
> The paid version, last I saw, was not available through an institutional license.
Does your institution have mathematica? In mathematica you can query WA directly, and it gives you as much (or possibly more, from how it seems to behave for me) computing time as people with WA pro subscriptions. I use it all the time for stuff like graphing complicated implicit 3d surfaces or doing multiple integrals, stuff where I know the relevant mathematica command but I would rather not type it out fully
The issue with recipe weight unit conversions might be that the author literally had a cup or spoon or whatever with a specific capacity that doesn't equal the standard units, so you are converting one inaccurate amount into another.
I'm not arguing in this case that it's more accurate, just that it's sometimes easier: if I've got a mixing bowl on a scale, I find it easier to pour things in by weight rather than to measure them all out. On recipes I often cook, I edit / write in the weight in grams to speed things up.
It's safe to assume any recipe written in the last 50 years is using the standardized units. It isn't literally 100% true, but close enough that it's not worth worrying about.
I used it a lot while pursuing my electrical engineering degree. Its ability to solve almost any mathematical formula and show you the solution step by step is just plain awesome.
I guess it's safe to say I would not have passed some algebra and electrical engineering exams without it.
One tip I have (not sure if it still works though): Buy the Android or iOS app for a few bucks to get access to the step by step solutions if you can't afford the pro subscription.
The pricing seems random to me (even though it seems cheaper than the last time I checked).
It's ~20% more expensive in euros than in dollars. (And Poland, which I checked out of curiosity since it's in the EU but does not use the euro, has a price in pounds which is even higher; Poland is not a rich country.)
Also I don't think charging for example people in countries in Africa as much as for example US people makes any sense.
The service is really great for some questions but the commercial offer never added up for me.
If the software were open source and ran on-prem, I would consider buying some additional online services for it (even at the current random price point and without having a real use case; it's not more expensive than an average online game, so bearable). It would also make that "Wolfram Language" worth having a look at. But I don't bother even glimpsing at closed-source programming languages. That's especially one of the things they do very poorly.
My guess is that it’s a bit too complicated/slow for a lot of ordinary people and too finicky for a lot of technical people.
I’m a frequent Mathematica user and I find almost all of my use cases require several different attempts to get the desired result w/wolfram alpha. Meanwhile, most people who don’t get the right result the first time will probably just give up and not think to rephrase the query.
I mainly use it as an english dictionary of math terminology.
Although for the basics of differential geometry, like the Weingarten equations and the Dupin indicatrix, WA is lacking, as is Wikipedia, except for the articles in the German Wikipedia.
And I haven't found a way to get to the 'Weingarten equations' by searching for 'Weingarten'; you only find him by the full name 'Julius Weingarten'. :(
That's a bit recursive. I'd have needed english translations of the german articles to get to know the english math terms to be able to write about them in english :)
The problem WA is attempting to address is nearly impossible: to trust WA as a reliable source of information, you have to be confident it will be able to answer the question you're asking. If you work in a specific problem space, you can probably know that, but even if WA does know your particular area, you likely know even better ways to answer your questions.
Putting it another way, it's too hard to know what WA knows and doesn't know. I alluded to this in a post I wrote back when WA first came out: https://gcanyon.wordpress.com/2009/06/07/bing-wolfram-alpha-... "As Alpha grows and adds new problem domains it will become more and more useful, but it will continue to be necessary to understand what it can and can’t do, and how to get it to divulge what it knows."
Honestly, Google can now do most of the basic things that WA could do.
And the more complex things WA could do oftentimes require a bunch of trial and error to figure out the correct syntax/phrasing to use to get correct results, to the point where it was just easier to either do the calculation manually or find a dedicated site for it.
It's still around but I imagine it is experiencing a bunch of competitors biting chunks out of it.
A lot more people can script now, so open-source packages and computer algebra systems (Sage, numpy, scipy, etc.) probably take a small bite.
And then you have closed source ones to consider like Matlab.
The second largest chunk is probably being bitten out of it by its web and app competitors (Desmos, Symbolab, etc.). Alexa rankings show that these see a lot more traffic and engagement (2-3 times).
Finally, a small portion of its functionality is now covered by search engines. I imagine they'll continue to gobble things up. There are also a few good web tools; I used one for a linear algebra course that I found a lot better than the freeware version of Wolfram Alpha that came with my Raspberry Pi.
I can't find any reports on its revenue or net income. I would be super curious who uses it. Maybe it's growing... who knows? I also remember it being recommended a lot in the early 2010s.
I'm talking about both. When I was comparing them to competitors like Symbolab I was using the Alexa ranking for alpha.
I find it faster and more accurate to use a specific package in an interpreter than query Wolfram Alpha or use Mathematica. And for the simpler things a search engine will do!
I used to use it a lot, but Google now provides most answers as well, and much faster. Wolfram Alpha's performance is still sluggish; 6-second loading for a bunch of text on simple queries like `6cet to pst` is frustrating.
For me it stopped working several years ago and wouldn't ever return answers for queries. I futzed about with it to try and make it work; came back a few times over a couple of years as I had been a big fan. Just assumed they'd killed it somehow. Mentioned it on HN and others said it worked. For some reason it works again for me now -- it not working allowed me to discover Geogebra, which was nice and served a lot of my previous uses for WA.
I think students these days use it for math/calculus, but it isn't seen as something special because they've always had it. It wasn't novel like for us.
I was enthusiastic, but for medium-complexity questions I spend more time fighting with syntax than it would take to do it myself. I probably use it for a high-complexity question once every few months. I'm happy that it exists, on balance.
As someone who just signed up for an open university this semester, I'd love to hear opinions about Octave and Maxima for general purpose use. Especially for study, such as replacing Wolfram Alpha's step-by-step solutions.
I'm a Linux user and prefer an open-source solution. But I have no objection to paying a reasonable amount of money for a good commercial solution. Maybe Maple is worth looking at?
It depends on what your need is. I used Maxima (wxMaxima) for quick prototyping of handwritten formulations and as a reference for some simplifications, roots, etc.
Of course, its CAS capabilities are still useful. But I find that simplifications done on paper are often more straightforward than coaxing expressions into the expected form in Maxima. Also, it's somewhat handy to have the ability to output the formulas in TeX format.
I vaguely remember Maple being more apt at expected simplifications.
Either way, I believe that Sage, Octave, Maxima, etc. should be supplemental to textbook-based learning. That way, their results won't appear as pure magic, but as the somewhat expected outcome of analysis.
I think it highly depends on what kind of math you're expecting (and how heavily you want to rely on a CAS). If most of your work is just simplifying equations and computing numerical solutions, basically any system will do whether that's Octave, Sympy + Numpy, etc.
I haven't used Maple for a while so I can't speak to its current functionality, but there have been several times I've wanted to do something in Sympy/Octave and haven't found it, whereas I can almost always get Mathematica to do what I want with a quick search. I tend to rely heavily on it for some more complicated/specific symbolic operations (e.g. symbolically transforming probability distributions), and for that use case I haven't found anything better.
I'll also say that if your use case is more numerical/programming oriented, the language used might be an important factor. I personally don't like Wolfram Language and use very few of its language features, and prefer Python for anything that Mathematica isn't suited for out of the box.
I prefer Python to Octave and Maxima. Numpy, scipy, and matplotlib for numerical stuff, and sympy for symbolic stuff. Having them together in the same general purpose language is really convenient, and Jupyter notebooks are fantastic. Sage is also good, but I've moved on to sympy. I don't know a way to get step-by-step working from a library, but sympy gamma can do some, so it's probably possible to some extent.
My experience suggests avoiding Maple like the plague. Sympy (and Sage) can do everything I ever used it for much nicer and easier.
$ TZ=Europe/Warsaw date --date=@1636221900
sáb 06 nov 2021 19:05:00 CET
$ TZ=Europe/Warsaw date --date=2021-11-06T19:05 +%s
1636221900
$ echo $(( ($(TZ=Europe/Riga date +%s --date=2021-11-05T17:00) - $(TZ=America/New_York date +%s --date=2021-12-05T09:00)) / 3600 ))
-719
However, this is super dangerous, because for whatever reason date(1) lies if you give it a nonexistent timezone, pretending that it understands you but actually giving you UTC:
$ TZ=Mars date --date=@1636221900
sáb 06 nov 2021 18:05:00 Mars
There's a list of valid timezones that you can conveniently browse with tab-completion after you spend 14 keystrokes to navigate there:
$ TZ=/usr/share/zoneinfo/Europe/
Amsterdam Berlin Chisinau Isle_of_Man Lisbon Mariehamn Paris San_Marino Stockholm Vaduz Zagreb
Andorra Bratislava Copenhagen Istanbul Ljubljana Minsk Podgorica Sarajevo Tallinn Vatican Zaporozhye
Astrakhan Brussels Dublin Jersey London Monaco Prague Saratov Tirane Vienna Zurich
Athens Bucharest Gibraltar Kaliningrad Luxembourg Moscow Riga Simferopol Tiraspol Vilnius
Belfast Budapest Guernsey Kiev Madrid Nicosia Rome Skopje Ulyanovsk Volgograd
Belgrade Busingen Helsinki Kirov Malta Oslo Samara Sofia Uzhgorod Warsaw
$ TZ=/usr/share/zoneinfo/Europe/Riga date
dom 07 nov 2021 06:50:44 EET
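A defensive workaround for the lying-TZ problem above (a sketch, assuming the usual zoneinfo layout):
$ tz=Europe/Riga
$ [ -e "/usr/share/zoneinfo/$tz" ] && TZ=$tz date || echo "unknown timezone: $tz"
dom 07 nov 2021 06:50:44 EET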
I wish I had a really good calendar math utility program that handled this sort of thing properly.
Maybe, and it wouldn't have to be as slow and unresponsive as Wolfram|Alpha or obscure your answers as an attempt to upsell you, but I think it would still tend to have the same kinds of essential usability problems: a gulf of execution in figuring out how to phrase a query so the system would understand it, and a gulf of evaluation in figuring out whether the calculation it had carried out was the calculation you wanted.
It doesn't work so well for times, but I often use Google search to multiply numbers with units together and get a result in the units I want, without having to worry about screwing up unit conversions.
Example:
4 atomic mass units * (1000 nm/sec)^2
Google Result:
6.64215616 × 10^-39 joules
I use this all the time. I use wolfram alpha for solving equations or systems of equations but I use google for unit conversions because it's got better input parsing (frankly).
I should try the wolfram alpha math entry mode probably, I think that didn't exist when I started using it. If I could manually enter the equations with stricter formatting to ensure it's interpreted properly I'd use it more.
$ units
Currency exchange rates from FloatRates (USD base) on 2021-01-17
3677 units, 109 prefixes, 114 nonlinear units
You have: 4 amu * (1000 nm/s)^2
You want: joules
* 6.6421563e-39
/ 1.5055352e+38
You have: ^D
It’s slightly less DWIMish (you have to say “atomicmassunits”, “atomicmassunit”, “amu”, or “u”, not “atomic mass units”) and somewhat awkward as a separate tool, but then resorting to your web browser for unit conversions is awkward in a different way. Non-interactive invocations, like units VALUE-OR-UNIT UNIT, work as well.
Alas, I often have to do these kinds of calculations on a random publicish computer or my phone and Google's converter is platform-independent. But not using Google services when feasible is certainly net good.
And of course my TI-89 had equally good unit conversion for practical purposes (since you can define your own units) so somehow the world is still playing catchup to a calculator from the 90s...
If you’re organized enough to have space for Termux on your phone, it does wonders in this department. I feel silly every time I punch Python code into that teensy touch keyboard, but damned if I know anything else that has a better input UI and isn’t orders of magnitude less versatile. (Maple Calculator and microMathematics are still on the “there was an attempt” level, in my experience.)
... Seriously, though, if you actually need this type of calculation regularly and didn’t just pick a random example, atomic-scale calculations are absolutely miserable to do in SI (and this is not a problem with SI; it’s a human-scale, engineering system, after all, and its metrological aspects, which were the actual advance originally, are completely unimportant here).
If I had to do this in my head or with a desk calculator, I’d just do it in high-energy units (c = ℏ = 1, mass and energy in eV, length and time in eV^-1). So,
4 amu = 4 × 0.93 GeV (a proton weighs 939 MeV, an amu is slightly smaller due to binding energy, rounding to 1 GeV is good enough for most purposes) ≈ 4 GeV,
(1000 nm / s)^2 = (1e4 Å / s)^2 = (1e4 / 1.97 keV^-1 s^-1)^2 (an angstrom is a typical atomic size, a keV is a typical [large] atomic energy, a fermi aka femtometer is a typical nuclear size, a MeV is a typical [not so large] nuclear energy, remember any of 197 MeV fm = 1.97 keV Å = 1, though again 200 is almost always good enough) ≈ (1e4 / 2 keV^-1 s^-1)^2 = 25e6 keV^-2 s^-2,
4 GeV × 25e6 keV^-2 s^-2 = 4e6 keV × 100e6/4 × keV^-2 s^-2 = 1e14 keV^-1 s^-2.
This is slightly inconvenient, we wanted energy in eV, but the seconds don’t seem to want to go away. I don’t remember Planck’s constant in eV s, but I do remember 2 keV Å ≈ 1 and 300e3 km/s = 3e8 m/s = 1, so let’s sprinkle it with those: 1 s = 3e8 m = 3e18 Å, so s^-1 = (1/3e18) Å^-1 = (2/3)e-18 keV, hence s^-2 ≈ 0.44e-36 keV^2, and 1e14 keV^-1 s^-2 ≈ 1e14 × 0.44e-36 keV = 0.44e-22 keV = 0.44e-19 eV.
The hardest part is pretending to be a normal person: you have to remember what an electronvolt actually is in normal units. Good thing this is numerically the same as remembering the charge of an electron in coulombs (1 eV = 1.6e-19 J),
0.44e-19 eV = 0.44e-19 eV × 1.6e-19 J / eV (turns out converting to a decimal fraction wasn’t a good idea after all, powers of two FTW) ≈ 4/9 × 16 × 1e-1 × 1e-19 × 1e-19 J = 64/9 × 1e-39 J ≈ 63/9 × 1e-39 J = 7e-39 J.
Good enough to a couple percent.
OK, I won’t pretend that this is easy or that I did it flawlessly the first time just now, but I do think this looks like a skill you could plausibly learn, unlike the textbook “SI all the things” calculation. The good news is that you’ve just seen essentially all the relevant constants you’re going to have to remember, except maybe Avogadro’s number if you’re going to have moles somewhere.
(One place where this doesn’t help is first-principles chemistry, things like electrolysis, because you need to subtract large binding energies to get a change that’s hundreds to thousands times smaller. Calculating things to a couple percent just isn’t good enough.)
Yes, I am familiar with this system. If anything, being a physicist is all the better reason to want a computer to deal with the units though...
My example was entirely contrived of course, a less contrived one would be estimating how long a gas cylinder will last. The tank name plate might say it has 200 cubic feet (sigh) and you need to flow at 10mL/min. How many months does the tank last? I'm talking about quick engineering tasks, not theory.
BTW, the answer is about 13 months, whatever that is in eV^{-1}:
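$ # a reconstruction with GNU units(1), not necessarily the original invocation; output rounded
$ units -t '200 ft^3 / (10 ml/min)' months
12.9215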
It just can't answer the questions I have. Last time I tried it, I was looking for buoyancy of various gases, but it insisted any such question necessarily referred to stuff on water.
It did OK figuring the fake "temperature" of LHC beams that fusion people like to quote because they sound more impressive than GeV.
The name makes it seem like pre-beta test software.
I'm waiting for the final release, and then I'm waiting some more for it to be declared stable, and then I'm waiting some more for it to catch on and be declared popular.
Not really, but that's what the name suggests to me.
I just tried it here because of TFA and it's good.
W.A. shows the actual current distance to the moon (as of right now, 224,520 miles). Google shows this as 238,900 miles, presumably an average value, but it has no explanation at all of what the number is. W.A. also includes a graph showing the variance. And a lot of other info.
I needed to plot something real quick to see if I was picking the correct function. I don't remember how many data points I had, but not many; under 200, I believe. W.A. told me to piss off and pay for it. So obviously I went and spent an hour more learning how to do it with Gnuplot, and did it with Gnuplot. Now I always go for Gnuplot right away.
I still use it all the time for unit conversions, odd time-based questions, etc. I find it's way better than the Google results because, if I think of something after the fact, I can tack it on and WA figures it out better than Google. E.g. "12 ft to meters * 3" is not handled right by Google but is handled how I want by WA.
When I'm making exercises to explain to my students in the math class, I use W.A. to double check the answer.
I also use it for calculation for comments in HN. Sometimes I need to make a back of the envelope calculation, and W.A. can convert the units and other boring stuff.
“Have you ever wanted to know when you turn 2 billion seconds old? How about 33,333,333 minutes old? When do you get to celebrate your 555,555th hour of life? As it turns out, all three of those milestones occur in the same 24-hour period!”
Wow Hackernews never ceases to amaze me, I enjoyed this. TIL I missed my 1 billionth second. You should also make a programmer mode, one that shows you powers of 2 (like 1024 days old)
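A quick sanity check with units(1) that all three milestones really do land about 23,148 days in (a sketch):

$ units -t '2e9 seconds' days
23148.148
$ units -t '33333333 minutes' days
23148.148
$ units -t '555555 hours' days
23148.125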
I still use it regularly too - even more so after listening to Stephen Wolfram’s 3-part podcast [1] with Lex Fridman, where he discussed the latest developments in Wolfram, Mathematica, etc.
* ~33% - I mess up syntax, but believe it SHOULD be possible, and push much longer than I intended, until finally settling on a partial solution and wishing I knew more - but also recognising I should have used a different tool, e.g. Excel
I use it often to calculate... stuff. It's great at calculating, but I really need to wrap my equation in at least 6 layers of brackets for it to parse correctly, or it will use some weird notations/arrangements.
I used it a lot more often back in college compared to now. Usually now it's for random one-off calculations I'm too lazy to do myself, like "how many weeks since 4/28" (my puppy's birthday).
I still use it regularly. the app is my go-to calculator.
it just hasn’t been updated in quite some time. there are a lot of ways the back end could support new UI features etc., but something seems to be holding it back.
I still definitely use it for teaching big O comparisons in a live / malleable way. Not sure I know of a comparable resource for that but maybe someone out there in the HNstroverse does?
I teach high school math. My kids use it all the time! This isn't entirely a bad thing. It's a very, very useful (and natural language) symbolic integrator for them.
I still use it once in a while when I don't want to bother converting non-base10 units, like to know the date in 90 days, or how many hours in x days, etc.
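Those particular lookups also work offline (a sketch; GNU date(1) and units(1)):

$ date --date='+90 days' +%F   # prints today's date plus 90 days
$ units -t '90 days' hours
2160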
Since nobody is mentioning it, around that time Wolfram Alpha started paywalling a lot of the more useful features. I used to use it in school and stopped when that happened. I'm not sure if they changed course since then.
This question reminds me of the time someone wrote to the letters page of a print publication (I think it was The New Statesman) asking "Whatever happened to the composer of the theme music for Trumpton?" (a popular children's TV series in the 1960s) and the composer wrote back saying "What do you mean whatever happened to me?"
To understand the state of Wolfram Alpha, you have to understand the guy behind it.
Wolfram Alpha was a pet project of Stephen Wolfram, the creator of Mathematica. He had grand visions for it. And for the first few years, it seemed like he was doubling down on it.
But then he got bored and started tackling a bigger problem: his own solution to the "theory of everything" problem -- something that has eluded the world's best physicists for decades.
But he was confident that he could best them all. Because he created Mathematica.
I'm not sure you intend it, but your comment kind of makes Wolfram sound like some sort of crank.
He's a leading thinker obsessively interested in this idea that everything around us is the product of a simple, fundamental ruleset.
He's sitting on the bleeding edge of human knowledge where, honestly, everyone is at risk of being full of shit. Scientific consensus isn't really any kind of indicator of future breakthroughs.
He is both a crank and an innovator, and comments regarding his crankery are perfectly appropriate.
I think his "new kind of science" needs to be singled out from Wolfram alpha and Mathematica as especially crank-ish. It appears to be an attempt at a grand foundational philosophical statement, but it doesn't interact with pre-existing literature that covers similar territory, conveys ideas with pictures and informal statements without robust definitions, doesn't have an underlying bedrock of concepts or uniform vocabulary, and doesn't have the focus or clarity of purpose to rise to the level of being right or wrong. And it nevertheless maintains a grandiose tone of establishing an entirely new domain of science
It's not necessarily wrong, but it is unfortunately very vague and concerningly childish, even though I think it does have some meaningful things to say. It's a very fair example in favor of crankery.
He has done some exciting work, but he hasn't done any physics in ages. If you don't (by choice or ability) do the work to prove your ideas, you can't expect anyone else to. If he wants to revolutionize physics, he can't leave that to others. That attitude is a defining characteristic of a crank.
Cranks can do good work, but when they get out of their depth and don't realize it, blaming everyone else, that's when they become cranks.
I like Wolfram, and I think there are some interesting and fundamental insights in among the relentless self promotion, but ANKOS is a painful read, even though I find cellular automata a fascinating model.
They don’t make him sound like a crank, but like a narcissist, which Wolfram definitely is. Not that it’s really a bad thing, most lang devs are a little narcissistic in my experience. Comes with the territory. But the guy named his language after himself. No one does that! Creating a language is already a very ego-driven endeavor, but naming it after yourself is next-level egoist.
His work is very interesting, and much of it novel, but it's the way he presents it that makes him a crank. Claiming he has a grand unified theory of physics and all that.
For those interested in hearing about his theory/work, Sean Carroll (theoretical physicist) did a very long podcast with him about it.
I'd highly recommend Sean's podcast in general for those interested in physics topics who prefer a more technical discussion than the usual physics podcasts.
I wanted to know how tall my daughter might be.
http://www.wolframalpha.com/input/?i=8%20year%20old%20female...
I wanted to know the nutrition content of an egg sandwich.
http://www.wolframalpha.com/input/?i=1%20egg%2C%20two%20slic...
I was curious about the relative usage of two names over time.
http://www.wolframalpha.com/input/?i=Michael%2C%20Henry