Hacker News | grandpa's comments

The title is "Excuse me sir, would you like to buy a kilo of isopropyl bromide?"

https://archive.org/details/gergel_isopropyl_bromide


That's a hilarious book: "There was a jar which I had not noticed before containing potassium metal. I knew that potassium was a silvery metal, but this was one inch spheres, green with the oil in which they were immersed. I removed two for a collection of elements we were starting at Columbia High, scraped off the oil and put the marbles in my handkerchief which I added to a collection of miscellaneous glassware in my back pocket. "

Oops... I can see where that is headed.


Thanks for the rec! Gonna dive into it tonight!


  git clone --branch start https://github.com/djkoloski/anima_solver
  Cloning into 'anima_solver'...
  fatal: Remote branch start not found in upstream origin


This is fixed now.


The opening of Borges' essay is absolutely brilliant writing:

> AT TRIESTE, IN 1872, in a palace with damp statues and deficient hygienic facilities, a gentleman on whose face an African scar told its tale — Captain Richard Francis Burton, the English consul — embarked on a famous translation of the Quitab aliflaila ua laila, which the roumis know by the title, The Thousand and One Nights. One of the secret aims of his work was the annihilation of another gentleman (also weatherbeaten, and with a dark Moorish beard) who was compiling a vast dictionary in England and who died long before he was annihilated by Burton. That gentleman was Edward Lane, the Orientalist, author of a highly scrupulous version of The Thousand and One Nights that had supplanted a version by Galland. Lane translated against Galland, Burton against Lane; to understand Burton we must understand this hostile dynasty.

Borges also admires Burton's footnotes:

> Thus Volume Six (which I have before me) includes some three hundred notes, among which are the following: a condemnation of jails and a defense of corporal punishment and fines; some examples of the Islamic respect for bread; a legend about the hairiness of Queen Belkis’ legs; an enumeration of the four colors that are emblematic of death; a theory and practice of Oriental ingratitude; the information that angels prefer a piebald mount, while Djinns favor horses with a bright-bay coat; a synopsis of the mythology surrounding the secret Night of Power or Night of Nights; a denunciation of the superficiality of Andrew Lang; a diatribe against rule by democracy; a census of the names of Mohammed, on the Earth, in the Fire, and in the Garden; a mention of the Amalekite people, of long years and large stature; a note on the private parts of the Moslem, which for the man extend from the navel to his knees, and for the woman from the top of the head to the tips of her toes; a consideration of the asa’o [roasted beef] of the Argentine gaucho; a warning about the discomforts of “equitation” when the steed is human; an allusion to a grandiose plan for cross-breeding baboons with women and thus deriving a sub-race of good proletarians. At fifty, a man has accumulated affections, ironies, obscenities, and copious anecdotes; Burton unburdened himself of them in his notes.


With `. ranger`, if you change directory and then exit, the shell's working directory changes to where you left off in Ranger.

  $ pwd
  /home/grandpa
  $ ranger
  [ ... change directory ... ]
  $ pwd
  /home/grandpa
vs

  $ . ranger
  [ ... change directory ... ]
  $ pwd
  /home/grandpa/src


Got it. Thanks.


This is really good fun! But it would be nice if I could finish up the problem I'm on when the timer runs out. Currently, it just gets taken away abruptly.


Yeah, it should let you finish the problem, and then optionally show the answer once you're done.


You could test this by looking at whether pairs of fraternal twins are as similar on personality traits as pairs of identical twins. If the fraternal twins are more different, that points to a genetic influence.


There have been lots of these twin studies. Most of what we know about genetic influences has been determined this way.

https://en.wikipedia.org/wiki/Twin_study


You can't objectively measure the difference between two people.


Why "infamous"?


I have no idea why I wrote that, bad wording.


I think this is the same technique as Russ Cox used to search Google Code: https://swtch.com/~rsc/regexp/regexp4.html
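For anyone curious, the core trick can be sketched in a few lines. This is a toy simplification (my own, not Russ Cox's actual implementation): index documents by their character trigrams, answer a literal query by intersecting the posting lists of its trigrams, then verify the surviving candidates.

```python
# Toy sketch of trigram-index search: a real regex engine would first
# extract required trigrams from the regex; here the query is a literal.
from collections import defaultdict

def trigrams(s):
    return {s[i:i + 3] for i in range(len(s) - 2)}

def build_index(docs):
    index = defaultdict(set)
    for doc_id, text in enumerate(docs):
        for t in trigrams(text):
            index[t].add(doc_id)
    return index

def search(index, docs, query):
    # Candidate set: docs containing every trigram of the query.
    candidates = None
    for t in trigrams(query):
        posting = index.get(t, set())
        candidates = posting if candidates is None else candidates & posting
    # Verify candidates with a real substring check (the trigram filter
    # can return false positives, never false negatives).
    return sorted(d for d in (candidates or set()) if query in docs[d])

docs = ["google code search", "regular expressions", "code review"]
idx = build_index(docs)
print(search(idx, docs, "code"))  # -> [0, 2]
```

The payoff is that the expensive match only runs over the handful of documents that survive the posting-list intersection.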


This is mentioned in the article (which links to rsc).


It parses "fruit flies like a banana" the same way as "Time flies like an arrow".

https://en.wikipedia.org/wiki/Time_flies_like_an_arrow;_frui...


Similarly, it seems to fail on "The old man the boat", marking "man" as a noun. The meaning of the sentence in this case is fairly unambiguous, but parsing it can be tricky. See also: https://en.wikipedia.org/wiki/Garden_path_sentence


In some sense, it's a mark of success for an AI system to fail in the same way that humans do. "The old man the boat" is a terrible sentence, essentially ungrammatical.


Can a sentence be terrible? In what way is it ungrammatical?


That's a good example of why this tool should probably have the option to output a sample of the top-N guesses.

Some sentences are just totally ambiguous without context. "Fruit flies like a banana." isn't even good English. Is the sentence trying to say "Some particular fruit flies like a particular banana"? Or "All fruit flies like any banana"?

By the way, Spacy creator - how's the NER coming along?


Spacy's implementation, assuming it's roughly equivalent to the one syllog1sm blogged about, just does a greedy incremental parse, so it only produces one candidate parse.

It is possible to do incremental dependency parsing with a beam, but all the copying of beam "states" is expensive and there are no guarantees that the n complete parses in the beam are really the n best parses w.r.t. the model.
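For reference, the greedy loop has roughly this shape. This is a toy arc-standard sketch with a stand-in scoring function (my assumption for illustration, not spaCy's actual model or transition system):

```python
# Greedy transition-based (arc-standard) dependency parsing: at each step,
# commit to the single highest-scoring action. No beam, so exactly one
# candidate parse is ever produced.
def greedy_parse(words, score):
    stack, buffer, heads = [], list(range(len(words))), {}
    while buffer or len(stack) > 1:
        valid = []
        if buffer:
            valid.append("shift")
        if len(stack) >= 2:
            valid += ["left-arc", "right-arc"]
        action = max(valid, key=lambda a: score(a, stack, buffer))
        if action == "shift":
            stack.append(buffer.pop(0))
        elif action == "left-arc":    # second-from-top takes top as head
            dep = stack.pop(-2)
            heads[dep] = stack[-1]
        else:                         # right-arc: top takes second as head
            dep = stack.pop()
            heads[dep] = stack[-1]
    return heads

# Dummy scorer that always prefers shift, then right-arc.
words = ["I", "saw", "her"]
heads = greedy_parse(
    words,
    lambda a, s, b: {"shift": 1.0, "right-arc": 0.5, "left-arc": 0.0}[a],
)
print(heads)
```

A beam version would keep the top-k (stack, buffer, heads) states at each step instead of one, which is where the extra evaluation cost comes from.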


Yes, I do greedy parsing. There are many advantages to this, which I'll write about before too long. Fundamentally, it's "the way forward". As models improve in accuracy, search matters less and less.

By the way, the beam isn't slow from the copying. That's really not so important. What matters is simply that you're evaluating, say, 8 times as many decisions. This makes the parser 6-7x slower (you can do a little bit of memoisation).


In that case, I wonder if it can output a probability score for each tag at each position, like pycrfsuite does? Then the output could be ensembled with other taggers, or otherwise pass that confidence information downstream.

Also, maybe a dumb question - is there any library or best-practice method for the ensembling of taggers / chunkers? Or must I create it myself from scratch?
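Something like this simple probability-averaging scheme is roughly what I have in mind (toy sketch; the tagger outputs below are made up):

```python
# Ensemble taggers by averaging each tagger's per-tag probability
# distribution at every token position, then taking the argmax.
def ensemble(distributions_per_tagger):
    """distributions_per_tagger: list of [ {tag: prob} per position ]."""
    n_pos = len(distributions_per_tagger[0])
    result = []
    for i in range(n_pos):
        combined = {}
        for tagger in distributions_per_tagger:
            for tag, p in tagger[i].items():
                combined[tag] = combined.get(tag, 0.0) + p
        # Summing is equivalent to averaging for argmax purposes.
        result.append(max(combined, key=combined.get))
    return result

# Two toy taggers disagree on position 1; averaging resolves it.
t1 = [{"NOUN": 0.9, "VERB": 0.1}, {"VERB": 0.6, "NOUN": 0.4}]
t2 = [{"NOUN": 0.8, "VERB": 0.2}, {"NOUN": 0.7, "VERB": 0.3}]
print(ensemble([t1, t2]))  # -> ['NOUN', 'NOUN']
```

This is only the simplest possible combiner, which is why I'm asking whether a library with better-established methods (weighted voting, stacking, etc.) already exists.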


How do you expect fruit to fly?


Definitely, but I think Oxford University is more interested in teaching the underlying principles than tracking what's popular in industry.

