Wow, very sad to hear this. My high school ran a trial project back in 2000 and my class became the first in my country where every student received a laptop to use in all classes. We used Scientific Notebook for math. Being able to solve any equation and draw any graph with the click of a button was a huge boost in my ability to grasp trig, polynomials, logarithms, etc.
I actually went from being a D-level math student pre-HS to graduating HS with an A+, and went on to get a degree in physics and math at uni. I honestly doubt that would've been possible if not for SN.
Way back when, just after the IBM PC came out in the early 1980s, a company named Triad Computing was formed by J. Mack Adams, Roger Hunter, and Barry MacKichan, with the goal of creating a WYSIWYG technical text editing system. I was their first employee. Roger, Barry, and I did all the programming. Venture capital for this sort of thing wasn't common then; we raised operating capital by contract work (e.g. I wrote a floating-point arithmetic library for the PDP-11). After about a year our product debuted, named T^3. Shortly after there was a company name dispute and we changed ours to TCI Software Research. I could go on at some length, but that's where MacKichan Software started.
When I went to Oxford University in 1985 I learned that, at the time, theses in the mathematics department were required to be written with T^3. I did not advertise my background--I did not want to be tech support. I actually went back to TCI for a while after finishing my degrees there.
As a founder/part-owner, like the other two he kicked in some money to get it started. Despite being a computer science professor he wasn't an effective real-world programmer so he in effect became a silent partner. He eventually got cold feet and his initial investment was bought out by the other two. (The other two ultimately went all-in and effectively gave up their professorships; J. Mack stayed in academia.)
Now that's a name I hadn't heard in a long time. It wasn't my cup of tea -- I prefer just writing latex directly, but it's a pity, as I'd rather see more people get into writing tex/latex, even if via GUI.
I understand why it's not that mainstream outside certain academic fields, though, because it can be pretty frustrating at times. But almost anything is better than Word for writing long technical documents, especially in a collaborative way. (Went through that once for a grant proposal; very painful.)
I used Scientific Word to write my dissertation and my first academic papers. It was a great piece of software in the late 1990s, at least relative to the alternatives, which were writing .tex files yourself or typing in a word processor. I have a coauthor that still uses SW.
I can't say this is surprising. I moved to LyX long ago, and then after markdown became popular I ditched LyX and went with that. There are lots of strong alternatives today. There probably aren't many people willing to pay for something they can get for free, that they already have, or that doesn't provide features for collaboration.
How does Markdown fill the niche of TeX or Scientific Word? Is there a mathematically oriented fork? You mentioned R Markdown in another comment, but I don't see anything in the gallery that looks germane.
Pandoc includes math support out of the box. R Markdown is actually built on top of Pandoc. I know many academics that write their papers in (Pandoc) markdown and then convert to PDF. It creates an intermediate .tex file that is then converted to PDF.
pandoc will turn markdown with TeX formatted equations into PDF or html (relying on tex and mathjax, respectively). It can also turn it into TeX if you need to tweak stuff.
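A minimal sketch of what such a source file looks like (the filename and content are just illustrative):

```markdown
<!-- note.md: Pandoc-flavored Markdown with TeX-formatted math -->
# Gaussian integral

Inline math such as $e^{i\pi} + 1 = 0$ works, as do display equations:

$$\int_{-\infty}^{\infty} e^{-x^2}\,dx = \sqrt{\pi}$$
```

Then, assuming pandoc and a TeX engine are installed, `pandoc note.md -o note.pdf` produces a PDF, `pandoc note.md -s --mathjax -o note.html` produces standalone HTML rendered with MathJax, and `pandoc note.md -s -o note.tex` emits LaTeX for further tweaking.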
A number of other systems like Obsidian (for note taking) use markdown and integrate MathJaX or KaTeX to render TeX formatted equations.
In 2021 there's honestly not any good reason to know about Scientific Word. It was a good product 20 years ago, but it doesn't bring much to the table today that you'd have a reason to pay for.
It made it easy to create latex documents without having to type latex code. It was especially handy for creating equations and tables. I don't know that it's missing anything today, it just doesn't provide a lot of value over something like LyX (that is open source and doesn't cost anything) or even a good markdown editor.
LyX is very popular. TeXmacs is not exactly latex, but it does a good job. Things built on markdown like RStudio/R markdown. Many like Overleaf, though I haven't used it much. Even MS Word isn't terrible these days.
Our course instructor on "Estimation and Detection Theory" was a big fan of Scientific WorkPlace. It did such a great job at replacing the chalk-board during lockdown.
That's a shame... science/math focused software is such a niche, and the people inside that niche (I imagine) _really_ appreciate that type of software.
There could be some kind of consulting opportunity here, though, for licensees who still want to pay for that type of software. Depending on their use case, a consultant could help migrate whatever they currently depend on to a semi-customized collection of Jupyter notebooks, etc.
"If you need to install your software on a new or different computer, you will need to re-activate the software on that computer using that serial number. [..] This contacts the MacKichan Software licensing server, which we will keep running for at least two years."
The hope:
"We expect to make Scientific Word an open source product eventually. Since both Scientific WorkPlace and Scientific Notebook contain the proprietary computer algebra system MuPAD, they cannot be made open source. When the open source project for Scientific Word is established, an announcement will be made here."
> Since both Scientific WorkPlace and Scientific Notebook contain the proprietary computer algebra system MuPAD, they cannot be made open source.
Don't you just hate the viral nature of infectious *proprietary* code?
To co-opt some propaganda used against GPL in the past:
Using a proprietary library in your application mandates that significant swaths of your application be released proprietary as well. That's not a problem if you wanted to release proprietary software to start with, but it's a problem for anyone else on either side of the spectrum.
For this reason, developers have to be quite careful about how they use proprietary software to make sure that it doesn't impose unwanted licensing restrictions.
> Don't you just hate the viral nature of infectious proprietary code?
Sure, I agree with you.
I'm currently looking for a term to replace "infectious" and "viral" since I want to talk about the GPL without bad health connotations. Any suggestion?
Would be nice if they open sourced it now that the company doesn't exist anymore. But it seems to never pan out and people just lose access to the thing they paid for.
It's a pity it's called "TeXmacs" when, as you say, it really has no connection to TeX (except the ability to typeset mathematics).
Another alternative is LyX, which actually does use LaTeX. Even that is not really a "LaTeX editor" though: it produces/exports excellent LaTeX, and uses that for typesetting, but does pretty poorly at importing LaTeX (as the developers freely admit). Depending on your situation that may or may not be a problem: if you just have to give a journal the final result in LaTeX then you can use LyX to produce it, but if you want to collaboratively work on a paper with someone else who wants to use raw LaTeX then it wouldn't be suitable. I wrote my master's thesis, course notes, and PhD thesis in LyX and would strongly recommend it - although it's a GUI, I would argue it has a steeper learning curve than raw LaTeX, but leaves you more productive in the end.
As far as I know, an importer that interprets arbitrary LaTeX cannot exist (i.e. you have to run LaTeX on the file) because the grammar of LaTeX documents is Turing-complete. There is a StackExchange Q&A where this is explained, but I do not understand the arguments ;-)
LaTeX imports and interprets arbitrary LaTeX, so other programs can theoretically do just as well. Importers running up against Halting Problems occur in many document formats, but this is not an issue in practice since most documents are not nefarious, and any importer can (and should) have sensible timeouts for parsing.
No, document formats are not Turing-complete. HTML, for example, can be parsed without any problem. But take this TeX snippet "\foo [bar]" and tell me whether "bar" is an argument of "foo" or not, without running the macro (which maybe you do not even have). This is the basic problem any TeX conversion program encounters. TeX is not a document format but a programming language. The only thing you can do with a TeX program is run it. An HTML document, on the other hand, can be parsed without knowing anything about its content.
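To make the ambiguity concrete, here are two hypothetical definitions of `\foo` (the macro name is just a placeholder) under which the exact same source typesets differently:

```latex
% With this definition, "\foo [bar]" typesets as "X [bar]"
% (the brackets are ordinary text):
\newcommand{\foo}{X}

% With this one, "[bar]" is consumed as an optional argument,
% and "\foo [bar]" typesets as just "bar":
\renewcommand{\foo}[1][default]{#1}
```

A converter that sees `\foo [bar]` without access to the definition has no way to decide between the two readings.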
I once upgraded an EPS interpreter that was used as the key piece of an EPS importer. I added a couple of PS things like 'arcto'. It seems to me a LaTeX importer could work as an interpreter, too. Probably not very much fun after a while.
Graphics is different, since graphics primitives are usually the common substrate of all graphics languages, so the interpreter can just be adapted to a different backend. For more structured content this is not possible: the natural output of TeX is a sequence of graphics boxes for the glyphs, or something like the DVI format. By then it is too late to retain semantic information, e.g. about mathematical formulas. Ask anybody who writes LaTeX-to-HTML converters, for example. TeXmacs does a good job of interpreting "usual" LaTeX files, but you cannot expect to interpret arbitrary files and still retain semantic information, e.g. sectioning, references, emphasised text, theorem statements, changes of language (e.g. from English to C++ to math), etc.
> For more structured content this is not possible: the natural output of TeX is a sequence of graphics boxes for the glyphs, or something like the DVI format. By then it is too late to retain semantic information, e.g. about mathematical formulas.
I think this applies to the natural output of TeX's "mouth" as well, i.e. one could expand LaTeX macros until one obtains TeX primitives, but at that point the information on structure is already lost.
> LaTeX imports and interprets arbitrary LaTeX, so other programs can theoretically do just as well.
Right, but the output is a typeset document. The fact that LaTeX (the program) can do this is no evidence at all that it would be possible to build a program that converts LaTeX (the format) into some intermediate representation that makes sense to a human. For example, assuming that macros are recursively expanded in place, it's possible that there's no clear dividing line between "this is a user macro that should be expanded before exporting" and "this is an internal macro that should not be expanded" (e.g. you probably want to leave the "theorem" environment from the AMS package unexpanded, because that is semantic information you'd want when you edit the document). As another example, what should be a table or equation might end up as individual lines and characters floating around as separate elements on a page.
That makes sense. Also (and I realise this is a bit less deep than undecidable grammar) I've certainly seen LaTeX documents with so many random macros that I don't recognise the content of it at all! I'm sure we've all come across those. I can understand why a program wouldn't stand much of a chance.
Personally I never really needed LaTeX import so I don't really see it as a problem, but I know it matters to some people.
On the "import from LaTeX" question, Joris van der Hoeven has advanced the argument that the impossibility of writing a general LaTeX importer allows LaTeX to maintain its status as the main document preparation system for scientific articles.
In fact, if you could import arbitrary LaTeX, a user of a different system could collaborate with LaTeX users (probably a form of "conservative conversion" would be necessary for back-and-forth translation between document formats).
>> It's a pity it's called "TeXmacs" when, as you say, it really has no connection to TeX
Yep. It sounds like something that ties emacs and LaTeX together. That's not something I'd even want to look at, and I haven't, for exactly that reason. My go-to math tool is Maxima, but I don't write, so typesetting isn't a thing for me - it does support MathML though, IIRC.
I understand that "the python programming language" probably doesn't involve snakes. On the other hand, "the TeXmacs math typesetting software" doesn't obviously not involve LaTeX and isn't obviously not for emacs.
It is a bit disconcerting to see the discussion on Hacker News remain at the level of "bad choice of name, btw!" with no technical insight. I would have expected more substantive reactions.

TeX and Emacs were initial inspirations for TeXmacs (which was written in the '90s). As such they inform some of its design choices; that said, TeXmacs does not conform to either. The Emacs legacy is very clear: all of the editor's behaviour is re-programmable in Scheme, from keybindings to menus to the graphics editor (see a recent example here: http://forum.texmacs.cn/t/why-not-draw-circle-by-center-and-...). As for the TeX legacy, TeXmacs improves on it by defining a proper document format while retaining the power of user-defined macros; it has also allowed Scheme code to be called from macros since its beginning, 20+ years before LuaTeX. An overview of TeXmacs design can be found here: https://texmacs.github.io/notes/docs/overview.html

I think people nowadays are more used to programs that are epsilon changes over existing systems, repackaged with brand-new names. That is why TeXmacs can be puzzling at first: it does not fit the usual categories. Maybe that says something about current trends in (free) software design.
Overleaf has decent tab completion and dozens of templates, as well as limited support for markdown. Let tabnine and copilot get to LaTeX; I think we’ll find the discoverability of “text language on Adderall” outguns GUIs.
> I think we’ll find the discoverability of “text language on Adderall” outguns GUIs
No way. Nothing can outgun typing a parameter into a box rather than the visual clutter of braces (or some other delimiter). What's more, maths notation often involves objects of different sizes, which isn't reflected in any fixed-width text language. Equations in my PhD were often 80% about the subscripts/superscripts if you read the raw LaTeX, with the most important stuff buried away; but when you look at the equation as rendered, or in a GUI, the most important bits are trivial to read off.
> lyx as a hard to get into GUI
I only suggest LyX is hard to get into because many people only get into it at a surface level (which is actually very easy), but then you don't get all the benefits of LaTeX. The best-case scenario is that you know both LaTeX and how to use LyX: then you can spend 1% of your time writing bits of LaTeX (in the preamble, or using math macros [1], or directly including the odd snippet of raw LaTeX as a last resort, and even then inside an instant preview) and 99% of your time using the GUI.
My main point in favor of GUIs is that one can easily read back what one wrote. With LaTeX, on the other hand, to read an equation comfortably I have to compile it; then, to edit, I have to move my attention back and forth between the compiled document and the source. (corrected a mistype)
It's mainly tables and to some extent figures that are intrinsically harder to read when written in TeX. The bulk of the text is just text, so if that's hard to read that's down to your editor settings.
(by the way the formula may contain some math mistakes---it is supposed to represent interface conditions for a system of waves in a nonlinear optical medium---but it is good enough for showing the editing point)
The obvious objection is that of course most people do use macros. But that's the point: every LaTeX document ends up being its own impenetrable language.
> you don't write complicated mathematical equations quickly if you've got any sense anyway
I don't get this line of reasoning. Writing equations is hard, so what difference does it make if you make it a lot harder? I think the fact it's hard makes it even more important to make it easier to write!
Proofreading/editing (as opposed to writing) is even more severe. When using raw LaTeX, there's absolutely no way to be sure an equation is right without checking the PDF, so you end up in a slow loop: typeset (the whole document!), read for a while, go back to the LaTeX to fix something you've found, spend a while finding where the problem is, etc. In LyX you're just organically reading and editing the document. If you spot an incorrect subscript, you click on it, fix it, and carry on reading. In raw LaTeX that can take 100 times as long, or even longer if you take into account the context switch in your mind.
I hadn't even considered the potential power of TabNine in LaTeX! That could be a gamechanger for routine stuff.
And I second your endorsement of Overleaf. I had to write a research proposal a few weeks back for work after not touching academic writing for 7 years. Overleaf made the process relatively painless.
Interestingly, the "X" in LyX is not pronounced the same way as the "X" in TeX/LaTeX. Would have been a nice pun, but they probably didn't want to have any "resurrected"/"undead" associations.
OTOH, Wikipedia informs me that "Lich" is actually pronounced "ˈlɪtʃ", so the pun wouldn't have worked anyway. I was pronouncing the "ch" like the one in "Leiche" in German until now. Oh well...
I doubt it, since in academia there are two gods, Word and LaTeX. It is very rare to find journals/conferences who don't specify one template or the other (with the possible exception of ACM, who are a law unto themselves). TexMacs might be OK for final preparation of camera-ready PDFs but by that point you're already invested in Word/LaTeX so there's no point switching typesetter.
I think there are good reasons to switch document preparation systems. Unlike LaTeX, TeXmacs can be programmed in a sensible way (it uses Scheme), and unlike Word, you have access to the source. It is superior enough to both systems that, I think, the future standard will work according to the ideas implemented in it.
Writing math in Word has the following disadvantages:
- referencing is a nightmare (this is probably the worst aspect). \label and \ref or \eqref? No: remember the equation number, click the References tab, Cross-Reference, select "equation" from the "Reference type" dropdown list, scroll down to the equation whose number you should remember, click "Insert". Did you add/remove an equation, thereby changing all subsequent equation numbers? Let's hope "Update all fields" works on the first try (hint: it won't).
- AMS math standards, \mathbb{R}, \mathcal{R}? No, \doubleR, \scriptR.
- \bar{x}? No, \bar SPACE SPACE left-arrow x right-arrow.
- \begin{align*}? No, good luck. (Actually I haven't even figured this one out).
- Version control? No, "Track changes". Oh, you changed a subscript or superscript? Your file is corrupted and cannot be saved, Ctrl-z until you can save and hope you remember all your changes.
- Seamless math typing if you are fast and proficient at LaTeX? No, Alt+= and hold on while I catch up. Did you go too fast? Sorry, Word has stopped responding. (This is a real problem in a 50+ page Word document full of math.)
Those are just the ones off the top of my head.
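For contrast, here is a sketch of the LaTeX idioms the list above refers to (the label name and formulas are just placeholders):

```latex
% amsmath/amssymb provide align*, \eqref, \mathbb, \mathcal, etc.
\documentclass{article}
\usepackage{amsmath}
\usepackage{amssymb}
\begin{document}

A numbered, referencable equation over $\mathbb{R}$:
\begin{equation}
  \bar{x} = \frac{1}{n}\sum_{i=1}^{n} x_i
  \label{eq:mean}
\end{equation}
Renumbering after inserting or deleting equations is automatic,
and \eqref{eq:mean} always resolves to the current number.

An aligned, unnumbered multi-line display:
\begin{align*}
  f(x) &= (x+1)^2 \\
       &= x^2 + 2x + 1
\end{align*}

\end{document}
```

Each of the Word workflows in the list above collapses to one of these one-liners in LaTeX, which is the gap the parent comment is describing.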
+1 for TeXmacs being superior to both Word and LaTeX. After using LaTeX for years I tried TeXmacs for a week and never looked back.
The secret is to revert to Draft/Normal mode and abandon the performance regressions of print preview mode. Word was never designed to use that as the primary editing view.
It's also just a tiny subset of LaTeX commands and not real LaTeX. I use the bm package to make things bold and I can't use that in Word. It doesn't support \begin{} and \end{} for any environment so for stuff like matrices you need to use special Word syntax \matrix{}.
The math equations made by Word have a distinctive appearance. Partly because they’re objectively worse in some (a few) aspects, partly because people associate them with Word, the result feels amateurish. This isn’t purely an intrinsic technical quality of Word, it’s just how the cookie crumbles.
> The math equations made by Word have a distinctive appearance.
Try using the GUST fonts (http://www.gust.org.pl/projects/e-foundry); Microsoft Word equations will then look excellently made.
The last time I used the GUST fonts, by the way, I could only export Word documents to PDF by printing: using the PDF converter, some glyphs would "lose pieces", while exporting via printing I would lose the clickable links. But I have not tried the Word plugin for the latest versions of Acrobat.
In TeXmacs, switching between math and non-math mode while writing up a document is seamless and fast. Moreover, typesetting quality is comparable to TeX/LaTeX.
Clearly you do not know what you are talking about. What is this "powerhouse difference"? What, in your opinion, is possible with one tool but not the other?
It's a unique kind of business that commits to doing substantially more development work after announcing they have ceased sales with no plan to restart.
Upon reading this article I thought I should check the fate of one of my favourite pieces of astronomy software (SkyMap). It looks like it met the same fate last year: Chris Marriott (the sole developer) decided to retire. I don't really get, though, why shut up shop and not open source such one-man projects :(
Good to see that they at least plan to open source an old version before they shut down. This does happen sometimes, for example Synfig Animation Studio had this happen.
Hmmm I've never heard about this software but I found some screenshots in google search, the UI looks a lot like another old CAS that I used in high school, "Derive" [1]. Derive was great :-)
I wish that when companies such as this go out of business, they would transition the licensing model to perpetual.
It is gracious to keep running the license server, but once that ends, that is it.
If the company is not going to lose any money because of it (which they will not, since they are out of the business entirely), let the users have a chance to keep using it.
(without support, without bugfixes, without upgrades to new platforms)
Presumably this could lead to people sharing licensing details, and non-paying customers may over time start using it; that should be an acceptable risk, given that there is no monetary gain or loss.
It would be awesome to have access to a rich library of dead software that people could use for free.
One issue you hit, and it sounds like they may be hitting here, is the need to pay yearly royalties or other yearly licensing fees for some of the components in their software.
The company that they are paying fees to is unlikely to give them a pass.
In these cases, which are not uncommon, there is not a lot they can do.
(Lots of companies have a mix of perpetual and non-perpetual users and pay royalties on the yearly folks and a one time fixed fee for the forever folks)
Or the software, as written, NEEDS to contact a server to ask the question "am I legit" - and to remove this need would require modifying the software and redistributing it.
Of course, if the company's funds run dry, they will stop renewing the URL registration, and one of us can register it and throw up a server that says "you are legit!" to every request, and the problem is solved.
That “one of us will register it” is unlikely to happen. It’s easier to run a local server that handles the url or to hack the executable to ignore the check.
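A local stand-in server of that sort is trivial to sketch. The real MacKichan licensing protocol is unknown, so everything here is hypothetical: this stub just answers 200/"OK" to every request, illustrating the idea of pointing the client's hostname at 127.0.0.1 (e.g. via a hosts-file entry).

```python
# Hypothetical "always legit" license-server stub. The actual request
# paths and expected response body are unknown; "OK" is a placeholder.
from http.server import BaseHTTPRequestHandler, HTTPServer


class AlwaysLegitHandler(BaseHTTPRequestHandler):
    def _reply(self):
        body = b"OK"  # placeholder success payload
        self.send_response(200)
        self.send_header("Content-Type", "text/plain")
        self.send_header("Content-Length", str(len(body)))
        self.end_headers()
        self.wfile.write(body)

    # Answer GET and POST identically: every request is "legit".
    do_GET = do_POST = lambda self: self._reply()

    def log_message(self, fmt, *args):
        pass  # keep the stub quiet


def run_stub(host="127.0.0.1", port=8080):
    """Serve forever; every request gets a 200 'OK'."""
    HTTPServer((host, port), AlwaysLegitHandler).serve_forever()
```

In practice hacking the executable to skip the check entirely is usually less work, since the client may also verify the response format, use TLS with a pinned certificate, etc.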
>I suppose there's only so long a one person company can go on for.
These were my thoughts exactly.
These people got an entire career out of independently working on software they (presumably) enjoyed developing. And if they open source it, the product lives on. It's not sad, it's great! Some of us should be so lucky.
My father (67, retired) has a couple of friends his age with long time, small businesses in the IT industry. One has been making some sort of map software for a very small niche in the marine industry, another has been making software for bee keepers or something similar. A third in image scanning software. They have been doing well since the late 80s, and still could be. They chose to go out of business and enjoy retired life instead.
I suspect that, although there is a market for the products, no one is interested in taking on the software they worked on.
> I suspect that, although there is a market for the products, no one is interested in taking on the software they worked on.
It may be extremely difficult to do that, for at least a few reasons.
In some cases, the tech is going to be very niche/old, and it will be difficult for people to get up to speed on it.
The business is likely built on a lot of personal contacts, and not all will want to switch to some 'newcomer'.
Many of these smaller business reliant on niche software will go out of business themselves, or be acquired by larger companies that will replace the niche software and process with something else.
The current customers may not actually want anything other than regular updates while paying minimal support/maintenance contracts. Someone established can live off that, but someone new will have to spend a lot of time learning, and the income may not be sufficient to justify all the learning.
I worked on a project in 2018 where some of the (small) company was still running on a combination of foxpro/db2, with a bunch of custom code by an indie/solo vendor who had 'retired' probably 5 years earlier. He'd 'sold' the company to another individual who ... kept it going, but couldn't easily deal with new needs (new reports, etc). Another vendor did an upgrade on the hosting server, and nothing worked after that. The upgrade was a base NT upgrade, and there wasn't any easy way to 'go back' quickly. Things ran off an RDP session to a laptop in Canada running some weird trial emulation tool under windows 7 (this was how it was translated to me from various parties).
Well, if I may, there is something that doesn't sound (to me) "rational".
I have seen quite a few cases where a larger company buys the old one (essentially to get the list of its current customers) and then terminates the product, replacing it with some crappy new stuff that usually completely fails to work. In that case the soon-to-retire programmer at least gets some (little) money, but the knowledge is lost forever.
But if the one man company is going to shut down because the programmer is going to retire, the acquisition cost for a young, willing programmer is 0.
This hypothetical young programmer could - I believe - invest some time to understand not so much the actual codebase, but rather the workflow of the program and re-write it along that same workflow in a new language/platform/whatever.
I am pretty sure that those niche users would be ready to pay a fair amount of money to have something modern/updated that actually works and works like the old one.
What I have often seen is that the new program, for no real reason, gets written by someone who is probably much more brilliant at programming but who knows nothing about how the program is actually used, and who has no idea how to deal with the "edge" cases (which surely already came up in the tens of years of life of the old software). In some ways it is as if all the experience accumulated over the years is suddenly lost, and the new program repeats the same (or worse) mistakes/issues that had already happened (and already been solved).
Maybe the problem is that there is not an easy way to tell to the world "I am going to retire, any taker?"
> This hypothetical young programmer could - I believe - invest some time to understand not so much the actual codebase, but rather the workflow of the program and re-write it along that same workflow in a new language/platform/whatever.
> I am pretty sure that those niche users would be ready to pay a fair amount of money to have something modern/updated that actually works and works like the old one.
I don't know. My experience is, for many smaller niche things, they're entrenched in orgs and used The One Way, and any deviation - change a button label, add a menu, etc - will result in a lot of complaints from existing users. They'll have to 'retrain', etc.
No doubt some people will appreciate and welcome 'new/modern' stuff, but many won't. And figuring that out ahead of time is... time. and effort. And along with that, there's usually heaps of institutional/domain knowledge that just can't be replaced without... time in the trenches.
It's not that it's not possible, it's just not typically 'worth it' for most people. ROI is too low compared to other options.
> I don't know. My experience is, for many smaller niche things, they're entrenched in orgs and used The One Way, and any deviation - change a button label, add a menu, etc - will result in a lot of complaints from existing users. They'll have to 'retrain', etc.
People hate this just as much in almost any software (or, most any UI, physical or virtual, for that matter), they just often don't have a way to push back.
You wouldn't know it based on current trends, but consistency and predictability are king in user interfaces, as far as actual usability goes. So much so that high levels of severe bugginess can be preferable than less-severe and common bugginess, if the former is consistent and predictable ("if I press this button then that one, the application will crash or glitch, every single time, no matter what state the program is in—so, I won't do that") and the latter isn't ("about once a day this button takes me to the wrong screen, and the behavior seems random"). If your users are your top priority, changes will occur gradually, and only with excellent motivation. Grand re-designs are among the most user-hostile things you can do (despite their popularity).
[EDIT] to be clear, they don't have a way to push back in the modern age of rolling updates and old versions being infeasible to obtain at all, possibly broken even if you can, and, most likely, full of known vulnerabilities that will never be patched. In the old days of desktop software that you actually purchased, and that operated just fine entirely offline, the way to push back was not to upgrade, and it was common.
Yes, that young, entrepreneurial programmer can be found, but won’t have the domain knowledge. Look at the examples above. You can’t sell Marine mapping software without knowing something about marine navigation. You can’t sell beekeeping software without understanding beekeeping. Plus you need the general business operation skills. Finding a programmer who knows beekeeping and wants to take on a low growth business is not as easy as finding a programmer on Upwork.
Yep. There may be 'takers', but will the existing customers want to work with them? So many solo/indie niche packages are built on the relationships, and replacing those - and the trust around them - is hard.
A client told me about someone they knew who did ballroom dancing software - it kept track of competitions, standings, etc. And... it seemed like decent money, looking at the pricing, and the size of the market. But the market didn't seem big enough for multiple players, and everyone trusted/knew/used the one main player. If/when he goes (or perhaps already has), I'm sure people will find another way to manage their stuff, but before then... who's going to come in to a market like that? How do you 'beat' the incumbent? Lower price? Who would switch? How do you convince people to switch to something unknown, potentially losing years of data, having new training costs, to ... save a couple hundred bucks maybe?
I'm sure there's hundreds of these sorts of services out there that are surviving, but don't have a huge market for competition, because the barriers to entry are too high relative to the return.
Yes, but I was talking more about "passing on" the knowledge/experience (as opposed to losing it and starting anew).
Regarding your "The One Way", sometimes that one way is actually the one that works better, as it was developed and fine-tuned over the years by a dedicated programmer who had constant feedback from users.
Probably that ballroom dancing software had a way to input (or present/output) data in a form that makes sense to the users.
Then along comes the new (otherwise brilliant) programmer who, knowing nothing about the specific field, invents his/her own way to input or render data that the users find awkward, or slower, or less intuitive, or whatever. The new program can't import the old data (or imports it only partially), can't use the same B&W printer because the output is highlighted with colours instead of with bolding/underlining, etc.
No surprise that the users of the old software (if it is still working) won't jump on the new bandwagon.
If it is due to retirement, the other employees might not be able to take over the product if it has been a 1 person show. I know of similar software where if the main developer were to leave, I have zero confidence they could do much besides maintenance. It isn't just the code, but having the mathematical understanding of the problem.
Also, setting this specific case aside, I wonder if it's hard to compete with Mathematica, which is ~$1000 for a professional license (very reasonable compared to most others in the industry), free on the Raspberry Pi, and free or nearly free for most students. There is also open source software that could eat into their niche.
Mathematica is at least ~$1.5k/year, or ~$3k if you want only the desktop application with no WolframAlpha-like features. Desktop licenses are not per user but per machine (with an 8-core limitation). If you want to move the installation to another machine, you have to send them a signed document and contact Wolfram (the company, not the person). I could go on all day about the friction and restrictions.
They have an excellent and unique product and I would love to give them money (between me and my employer we pay about EUR 1000/year to IntelliJ, and that's only one provider). But for some reason I do not feel comfortable with the immense power imbalance between them and myself. It makes me uneasy to invest in the ecosystem.
I wonder how many clients they have beyond their captive users in the academic world.
Fully agreed on the friction. It is quite annoying if you have a license but it's not widespread at your company. The core restrictions are ridiculous as well for such an expensive product.
I use it for things I need, but similar to you, I try to stick to other tools (Python for me) just to avoid the frustrations.
They can move a license fairly easily, but there ended up being some other annoying issues for me recently that made me want to wash my hands entirely.
"We expect to make Scientific Word an open source product eventually. Since both Scientific WorkPlace and Scientific Notebook contain the proprietary computer algebra system MuPAD, they cannot be made open source. When the open source project for Scientific Word is established, an announcement will be made here."
It looks like they just hope for it to be continued as open source - except for the part that The MathWorks, Inc. owns anyway.
One person businesses often require a unique collection of skills to operate and don’t make enough money to hire a team of professional operators. This is also true of traditional small businesses like restaurants and car mechanics. The phenomenon is discussed extensively in the “E-myth” series of books. The TLDR is if you ever want to get someone else to run your business, you have to be careful to systematize and document how it works.
I relate to this. A 1-person company means getting the job done. As soon as you have 2 people you really need 3: one to do the labor, another to schedule and bill (a receptionist). If it's a physical company you need more space. Some software charges per person too.
If something gets cancelled, the 1 person company takes a few hours off. Your 3 person company still needs to get their weekly paychecks.
Whenever I hear calls to raise the minimum wage to absurd(?) levels ($20/hr), I think that this company will never be able to grow past 1 person. You can pay for a receptionist if you are a big corp with many employees, but it's really hard when you are going from 1 person to 3. We'd even let the receptionist work fully from home and just answer phones so they can watch their kids.
"I guess we don't deserve to be in business".
And the big corporation that provides 60% of our service conquers the industry.
People buy restaurants and car shops mainly because of licensing (such as alcohol licenses) and zoning laws. It's rare for a specialty shop to be sold unless the owner has an apprentice.
It’s more common if the specialty shop has expanded to a chain. Because the same skills and tactics it takes to make a chain business also make a salable / maintainable business.
Pretty sure restaurants and car mechanics aren't unique skills and when they retire in my area someone usually buys them out and continues operating under the same name...
From what I recall from reading one of the "E-myth" books a while ago, whether the newly-owned shop operates similarly to the old one, and how difficult it is for the original owners to sell it, will depend on the documentation GP mentioned.
Going from my memory, the "E-myth" book(s) insist on thoroughly documenting operating procedures and turning them into a playbook: rigid in terms of personnel training, but open to change if someone figures out an improvement. The goal is to be able to scale up operations into multiple venues (franchise models) while maintaining highly consistent appearance and level of quality.
As much as having such documentation will make it easier for a franchisee to get the business running (the books call it "turn-key business", IIRC), it will also make it easier for a potential buyer to take over and immediately hit the ground running.
The properties are already converted for a specific use, so it is cheaper to reuse as much as possible.
But those will be different businesses. The restaurant will not make the same dishes or retain the same customers. The mechanic will operate on different types of vehicles.
> Is it due to the market, or the owners are retiring/giving up?
Bit of both.
Certainly people who tripped into a niche back in the 1980s/1990s are getting old.
However, modern customer support has gotten stupidly expensive and annoying as hell. I blame the fact that modern users are significantly less sophisticated and significantly more demanding.
The previous comment made a joke of how despite being a capable device for education, it didn't have the features that drive commercial success. How was it unsubstantive?
The bar for a one-liner joke like that to count as substantive is high—it needs to include something genuinely new and clever or else it's just internet fodder, and will encourage others to make even worse posts. The GP comment was way too obvious and predictable to clear that bar, which is why it led to a lame subthread (mostly now flagkilled).
The gold standard HN explanation about this was by scott_s many years ago:
The sibling thread I was going to reply in has been deleted, but I wanted to share what I learned this morning, as a young-blood who was hardly alive at the time:
a. Laptops of the time sport specs that assure me they'd be perfectly capable of performing the majority of tasks I engage in on my computer, and education especially sets a low bar; I'm sure they could run a terminal, CLI text editor, calculator, interpreters, and Wikipedia like any modern PC (thanks to Wikipedia for staying accessible!). Selfies simply aren't the most important thing.
> The released in 1993 SGI Indy [sic] is the first commercial computer to have a standard video camera
> The first widespread commercial webcam, the black-and-white QuickCam, entered the marketplace in 1994 [ . . . ] $100
> The first widely known laptop with integrated webcam option, at a pricepoint starting at US$ 12,000, was an IBM RS/6000 860 laptop and his ThinkPad 850 sibling, released in 1996.
> Around the turn of the 21st century, computer hardware manufacturers began building webcams directly into laptop and desktop screens, thus eliminating the need to use an external USB or FireWire camera.
So, yes! Idk exactly what OP had, but if laptops of the time were starting to include built in webcams, and could run a 20fps webcast, then they could certainly "take and share selfies and video". Interestingly, 2000 was also the year of the first Presidential Webcast in the US, though I'm pretty sure Clinton didn't stream it from his iBook.