Brownian thought space

Cognitive science, mostly, but more a sometimes structured random walk about things.

My Photo
Location: Rochester, United States

Chronically curious...

Thursday, September 28, 2006

The demise of neural nets?

I don't really like neural nets. Rather, I don't like neural nets when it comes to the human mind. The only reason neural nets seem to exist is that someone saw their potential for computation. Of course, that potential itself has not been fully realized; most "neurons" in typical neural nets are little points with rather simple I/O functions. Of course, most neural netters will tell you that the brain is full of neurons. The whole central nervous system is full of neurons. But really, there are ten times fewer neurons than there are glial cells, and they too are busy, nourishing, maintaining and organizing neurons, and even participating in neural transmission. And a recent paper in Trends in Neurosciences now says, "Astrocytic complexity distinguishes the human brain". I'm just waiting for the discovery of the role of astrocytes in setting up neural circuits and participating in neural function! After all, the radial glia are busy directing neurons to the cortex... how much of a role do they play in setting up the initial cortical specificities? Could be LOTS! And hopefully we'll be rid of the "neural nets do everything" culture. I just hope we won't usher in an "astrocytes do everything" era...

Microbiology class, 1993, Pune, India

Wednesday, September 27, 2006

Poeta fit, non nascitur

No, not a Latin course. It's a poem by the great Lewis Carroll that I was re(re-re..)reading last night. The killer is in the last stanza, as anyone who's tried publishing papers will attest. (Project Gutenberg link here.)

POETA FIT, NON NASCITUR - Lewis Carroll

"How shall I be a poet?
How shall I write in rhyme?
You told me once 'the very wish
Partook of the sublime.'
Then tell me how! Don't put me off
With your 'another time'!"

The old man smiled to see him,
To hear his sudden sally;
He liked the lad to speak his mind
Enthusiastically;
And thought "There's no hum-drum in him,
Nor any shilly-shally."

"And would you be a poet
Before you've been to school?
Ah, well! I hardly thought you
So absolute a fool.
First learn to be spasmodic -
A very simple rule.

"For first you write a sentence,
And then you chop it small;
Then mix the bits, and sort them out
Just as they chance to fall:
The order of the phrases makes
No difference at all.

"Then, if you'd be impressive,
Remember what I say,
That abstract qualities begin
With capitals alway:
The True, the Good, the Beautiful -
Those are the things that pay!

"Next, when you are describing
A shape, or sound, or tint;
Don't state the matter plainly,
But put it in a hint;
And learn to look at all things
With a sort of mental squint."

"For instance, if I wished, Sir,
Of mutton-pies to tell,
Should I say 'dreams of fleecy flocks
Pent in a wheaten cell'?"
"Why, yes," the old man said: "that phrase
Would answer very well.

"Then fourthly, there are epithets
That suit with any word -
As well as Harvey's Reading Sauce
With fish, or flesh, or bird -
Of these, 'wild,' 'lonely,' 'weary,' 'strange,'
Are much to be preferred."

"And will it do, O will it do
To take them in a lump -
As 'the wild man went his weary way
To a strange and lonely pump'?"
"Nay, nay! You must not hastily
To such conclusions jump.

"Such epithets, like pepper,
Give zest to what you write;
And, if you strew them sparely,
They whet the appetite:
But if you lay them on too thick,
You spoil the matter quite!

"Last, as to the arrangement:
Your reader, you should show him,
Must take what information he
Can get, and look for no im-
mature disclosure of the drift
And purpose of your poem.

"Therefore, to test his patience -
How much he can endure -
Mention no places, names, or dates,
And evermore be sure
Throughout the poem to be found
Consistently obscure.

"First fix upon the limit
To which it shall extend:
Then fill it up with 'Padding'
(Beg some of any friend):
Your great SENSATION-STANZA
You place towards the end."

"And what is a Sensation,
Grandfather, tell me, pray?
I think I never heard the word
So used before to-day:
Be kind enough to mention one
'Exempli gratia.'"

And the old man, looking sadly
Across the garden-lawn,
Where here and there a dew-drop
Yet glittered in the dawn,
Said "Go to the Adelphi,
And see the 'Colleen Bawn.'

"The word is due to Boucicault -
The theory is his,
Where Life becomes a Spasm,
And History a Whiz:
If that is not Sensation,
I don't know what it is.

"Now try your hand, ere Fancy
Have lost its present glow--"
"And then," his grandson added,
"We'll publish it, you know:
Green cloth--gold-lettered at the back -
In duodecimo!"

Then proudly smiled that old man
To see the eager lad
Rush madly for his pen and ink
And for his blotting-pad -
But, when he thought of PUBLISHING,
His face grew stern and sad.

Monday, September 25, 2006

Useful thesis advice

More generally, useful advice for writing reports...

Sunday, September 24, 2006

Kids' speech errors: A UG perspective

What explanations does a UG-supporter have for the errors that kids make? Here's a classification I made (derived from a talk by Paola Crisma at the Università di Trieste):

1) The main divide is between Competence and Performance. The primary difference is that Performance errors are less likely to be systematic than Competence errors. (Of course, probabilistic rules are not excluded.)

2) Some cues might require maturational changes. So, no amount of evidence before a certain age will convince the kids to set certain parameters.

3) ...which brings us to the most interesting (personally!) view: that errors are Competence errors, and they come about by mis-setting parameters. This is a consequence of the Continuity Hypothesis, a version of which we heard in much of Lila's talks. I suppose the Subset Principle of Wexler et al.(*) goes under this mis-setting account. However, over the last year and this, I've heard little bits of stuff from here and there which suggest a nice sub-classification of such mis-setting errors:
  • Errors of omission. These are of the kind discussed by Nina Hyams, for example when you find pro-drop in a non-pro-drop language.
  • Errors of commission. These are probably the nicest! We heard some of this last year when Anthony Kroch visited our lab. One example is the finding that some English kids do not invert the auxiliary and the subject in wh-questions (so they say "What John had done?").

(*) Manzini, M. R. & Wexler, K. (1987). Parameters, binding theory, and learnability. Linguistic Inquiry, 18, 413-444.

Saturday, September 23, 2006

2nd World Conference: Future of Science

Just saw the streaming version of Daniel Dennett's talk at the Second World Conference on the Future of Science: Evolution, being held in Venice. Some screengrabs: Dan Dennett; Steve Pinker asking a question (with Marc Hauser next to him); Philip Pettit (Princeton).


Here's an idea. Imagine a Wiki which would be a collection not of scientific papers, but of paper summaries. Advantages:
  1. Quick outline views of published (or to-be-published) articles and manuscripts.
  2. Allows discussion of specific papers by anyone in an easy, online way: one Discussion section for registered, university-affiliated (past or present) scientists, and a second Discussion section for unregistered, interested laypeople
  3. Links to full-text sources, including pre-prints
  4. Not limited to particular journals, so gives a broad coverage
  5. Searchable in journal mode; in a hierarchical, Dewey Decimal-style mode; in a standard meta-information mode; or in a special, heuristic-based clustering mode


The main organizational plan is something like the Dewey Decimal system, except that beyond some (arbitrary) depth, things would be clustered by meta-data and by the overlap of the references.
Reference-based similarities
The main idea is that if you are quoting the same people, very likely you are talking about similar things. But if paper X was published in 2005 and paper Y in 1997, then naturally part of the non-overlap will be due simply to the different dates. So the obvious thing is to first remove all references after the smaller of the two papers' most recent dates. Here is an algorithm for computing the reference-based overlap of two papers X and Y:
  • Find MinDate = Min{MaxX, MaxY}, the smaller of the most recent reference dates of the two papers.
  • List {Ref(X')} and {Ref(Y')}, all the references dated <= MinDate.
  • Find some metric of overlap, e.g.

RefScore = (common elements in the two lists) / (total number of elements in the two lists)

Clustering would be based on RefScore, shared Author lists, and Keywords.
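The RefScore algorithm can be sketched in a few lines of Python. This is a minimal sketch under stated assumptions: references are represented as hypothetical (author, year) pairs, the two sample papers are invented, and "total number of elements" is read as the number of distinct references across both lists (a Jaccard-style score) — the post deliberately leaves the exact metric open ("some metric of overlap").

```python
# Sketch of the reference-overlap score (RefScore) described above.
# The papers and their reference lists are invented, illustrative data.

def ref_overlap(refs_x, refs_y):
    """RefScore for two papers, each a list of (author, year) references."""
    # MinDate: the smaller of the two papers' most recent reference dates
    min_date = min(max(y for _, y in refs_x), max(y for _, y in refs_y))
    # Keep only references dated <= MinDate, so the later paper isn't
    # penalized for citing work that didn't exist when the other was written
    fx = {r for r in refs_x if r[1] <= min_date}
    fy = {r for r in refs_y if r[1] <= min_date}
    if not (fx | fy):
        return 0.0
    # shared references / total distinct references (Jaccard-style)
    return len(fx & fy) / len(fx | fy)

paper_x = [("Chomsky", 1995), ("Wexler", 1987), ("Hyams", 2001), ("Kroch", 2004)]
paper_y = [("Chomsky", 1995), ("Wexler", 1987), ("Fikkert", 1996)]
print(round(ref_overlap(paper_x, paper_y), 2))
```

Clustering could then group papers whose pairwise RefScore exceeds some threshold, combined with shared-author and keyword similarity as the post suggests.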

Friday, September 22, 2006

Sensorimotor cognition and natural language voodoo

Heard (yet another) talk with the kind of gargantuan leaps of faith that usually make me nervous and jumpy: "Sensorimotor cognition and natural language syntax" by Alistair Knott from down under Down Under. Unfortunately, it wasn't, as I'd thought from the title, a bootstrapping-like theory. Instead it was a just-so story from reading too many neuronal thingies and identifying some kind of "cycling", coupled with Chomskyan-style Minimalist Syntax with something like "cycling", and then a little beating of drums, chicken bones and entrails and hey presto! "Cycling" = "Cycling"! Which of course means that sensorimotor mumbo = syntax jumbo! Grrrr! Of course I like leaps of faith! That's what makes science so much fun! But personally this was something for the pub, with lots of beer, chips, interesting chicks and lots of paper napkins (for writing... nothing to do with the chicks). Then, once the alcohol fog has cleared the next day and you have deciphered the hieroglyphics on the napkins, you try to come up with predictions and go firm up EVERY little link and see what's what and what's not...

Thursday, September 21, 2006

Another People's Republic

Here's a t-shirt I made for Debora & her friends, who all live in Roiano (technically a neighbourhood of Trieste). I have my own as well, since I work there.

Sunday, September 17, 2006

Symbols, according to Huxley & Koestler

Came across these two quotes I'd jotted down in some old file. The central idea is pretty darn similar, although differently beautiful in the two cases... {warning: not for the connectionist-hearted ;) } 1. Aldous Huxley, Island
Spiders can't help making fly-traps, and men can't help making symbols. That's what the human brain is there for - to turn the chaos of given experience into a set of manageable symbols. Sometimes the symbols correspond fairly closely to some of the aspects of the external reality behind our experience; then you have science and common sense. Sometimes, on the contrary, the symbols have almost no connection with external reality; then you have paranoia and delirium.
2. Arthur Koestler, The Ghost in the Machine
But man has an irrepressible tendency to read meaning in the buzzing confusion of sights and sounds impinging on his senses; and where no agreed meaning can be found, he will provide it out of his own imagination. ... The sensorium extracts meaning from the chaotic environment as the digestive system extracts energy from food.

Saturday, September 16, 2006

Baby constraints

List of constraints for word-learning in babies:
  • prosody
  • syntactic frames
  • phonological development
  • social cues
(why am I writing this?)

Friday, September 15, 2006

Dinner with the Gleitmans!

(ps: in Figure 1 above, the unexpected guest represents a certain individual with a self-described abhorrence of publicly available pictures of self. Ass.) pps: GIMP is amazing :)

With Henry G

Henry asked the question I dread: why do I do the specific (scientific) things I do. I said because of (a) sheer curiosity, (b) historical accident and (c) a capability to do such stuff (rather, an incapacity to solve equations or spend large amounts of time traipsing in forests and such-like). Later at dinner he expanded on the theme: some people are like those who want to get to the south pole at any cost (like Judit)... others are not (me). But in either case, what is essential is a love for the subject; being careful that the "love" is not promiscuous :) Well, so there is, I think, a first inkling of a response to why I do what I do: the deep love is for human nature, corny as that might sound...

Henry & EC Tolman

Henry told us a lovely quote from his mentor, Edward Tolman: "I now believe that anything of interest in human affairs (with the possible exception of language and the super-ego) can be understood by studying the behaviour of a rat at a choice-point in a maze". :)) Translated into contemporary language, the exceptions of course represent the two most puzzling and deeply human facets: language and morality. Another historical gem: According to Henry, as students, psychologists modeled themselves after physicists. In retrospect, he says, they should have modeled themselves after biologists instead: Darwin rather than Newton.

Wednesday, September 13, 2006

Amazing people

Lila Gleitman and Henry Gleitman. Amazing scientists, amazing thinkers, amazing people!

Tuesday, September 12, 2006

Phonetics vs phonology (again?)

Here is something that came out of Paula Fikkert's talk. There has been this distinction between phonetics and phonology and acoustics aplenty. For example, what is the unit over which TPs (transitional probabilities) are computed in fluent speech? Syllables? Consonants? Jacques thinks it might be over something acoustic, while at the other extreme others believe it is over phonological representations. Clearly, very young infants, who have all the possible phonetic contrasts, must use those for computing TPs. But the problem is, you might have a certain feature in your language, yet it might be completely useless in terms of the lexicon. As Paula suggested, the feature [coronal] is redundant for German: you NEVER have two phonemes which differ only in coronality. The same is the case for aspiration in English. It's clearly there, and it's very relevant for prosodic phonology, since voiceless stops like /t/ are aspirated when they are foot-initial. But does aspiration contribute to the lexicon? Not at all! So, here is a proposal: ALL features are available to the infant, but as an adult you retain only those that (1) form minimal pairs in words and (2) are used in prosodic processes. I've always maintained that what you hear and how you respond to it is task-specific. As a consequence, if you are trying to spot words, or somehow your word-retrieval system is tickled, you WILL throw away acoustic information. Now I think this can be extended: not only do you throw away acoustic information, you also throw away phonetic feature information that is irrelevant. I now remember saying this to Adrienne Fairhall a long, long time ago... that the only reason you would not be sensitive to allophones is if they did not change lexical items. So perhaps Jacques is the closest: it might be that even adults compute TPs, for a foreign language, over the universal featural repertoire (except [coronal] ;))

Making sense of Stager & Werker (1997)

We had a great mini-conference with Lila Gleitman, Paula Fikkert, Anne Christophe and all of us. Paula Fikkert's talk was probably the one with the most new stuff, since we are all already quite familiar with Lila's and Anne's work. The best thing about Paula's talk was that it finally made very, very clear the most perplexing paper in acquisition: Stager & Werker (1997), in Nature. In that paper, 12-month-olds were able to discriminate /b/ from /d/, but when "taught" that an object is called /bin/, they could not distinguish /bin/ from /din/. Stager & Werker proposed that, while phonetic detail might be perceived, using it for defining lexical items might be a much harder task; only by 14 months do you finally have enough resources to succeed. Paula replicated the failure of 12-month-olds to distinguish /bin/ from /din/ in a word-learning situation and their success at distinguishing /b/ from /d/ in a non-word-learning situation, but then showed that the same infants were able to discriminate /bon/ from /don/ in a word-learning situation! How come? Well, they start with the observation that young infants (a) produce words such that a feature on the vowel spreads to all the phonemes, and (b) leave coronality under-specified. So, in the mental lexicon, wherever there is a coronal feature, nothing is entered. Since /i/ is coronal, and since the vowel feature spreads, nothing is represented, so there is nothing to mismatch.
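The underspecification story can be made concrete with a toy sketch. Everything here is my own illustrative assumption — a hypothetical place-feature table and a crude one-feature "spreading" rule for C-V-C words — not Fikkert's actual analysis:

```python
# Toy model of the underspecification account: coronal features are not
# entered in the lexicon, and the vowel's place feature is assumed to
# spread over the whole word. Feature assignments are simplified,
# hypothetical illustrations, not a real phonological analysis.

PLACE = {"b": "labial", "d": "coronal", "i": "coronal", "o": "dorsal"}

def lexical_entry(word):
    """Store the word's (spread) vowel place feature, leaving [coronal]
    unspecified: an underspecified feature is simply not entered."""
    vowel_place = PLACE[word[1]]  # word shaped C-V-C, e.g. "bin"
    return None if vowel_place == "coronal" else vowel_place

def mismatch(stored, spoken):
    """A spoken form mismatches only if the entry specifies a place
    feature and the spoken onset carries a different one."""
    if stored is None:   # nothing represented -> nothing to mismatch
        return False
    return PLACE[spoken[0]] != stored

# /bin/: /i/ is coronal, so nothing is stored -> /din/ cannot mismatch
print(mismatch(lexical_entry("bin"), "din"))
# /bon/: /o/ contributes a specified feature -> /don/ does mismatch
print(mismatch(lexical_entry("bon"), "don"))
```

On this sketch the puzzle dissolves: /bin/ vs /din/ fails not because the contrast isn't heard, but because the stored entry for /bin/ contains nothing for /din/ to clash with, whereas /bon/ leaves a specified feature in the lexicon.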

Wednesday, September 06, 2006

Lab meet with Lila Gleitman: The origin of Schmlanguage

Lila G has exceedingly interesting things to say and has a super-rich conception of the problems. As a bonus, she is very, very witty, for example starting her discussion on observational learning of nouns with the yogiism "You can observe a lot by watching". One of the nice things about yesterday's talk was the clarity about what is meant by "language". I think it is quite clear, both in and out of Chomskyan-style syntactic approaches, that there is a lot of stuff happening with just the lexical items alone. There is no longer a stark distinction between the syntax and the lexicon. As LG said, knowing the verb "give" implies knowing both that there is a certain subcat frame, and also that giving is composed of the giver, the given and the givee. So when Luca objected that word learning might not be core-language, but something else, LG said ok, let's just say I study schmlanguage. Schmlanguage. Looks like I study schmlanguage too! In fact I rather think the term is useful, in that it moves away from the syntacto-centric view of language (a la Ray Jackendoff) towards... well, schmlanguage :)

Jerry Fodor and giraffes

Lila showed us this lovely quote from Jerry Fodor, which just shows why he makes stuff so much fun to read, while capturing the essence of the problem:
To be sure, a photograph may show three giraffes in the veldt; but it likewise shows a family of giraffes; and an odd number of Granny’s favorite creatures; and a number of Granny’s favorite odd creatures; and a piece of veldt that’s inhabited by any or all of these.

Saturday, September 02, 2006

M. Tomasello and usage-based theories

Came across a cartoon which rather reminded me of usage-based theories like those popularized by Michael Tomasello. Essentially, in usage-based theories, very young children (less than 4 years old) do not have the kind of linguistic competence that adults do. As an example, while an adult might (in her/his mind, if (s)he has one) interpret the sentence "Mom shuts the door" as [N]subjNP-[V]V-[D-N]objNP, a young kid would only see this as something like [Mom]N?-[shuts]V?-[the]??-[door]N?. That is, nouns and verbs are tied to particular words, so kids will not readily go from "Mom shuts the door" to "Johnny shuts the door": the kids don't really learn about NPs and VPs, but about words like Mommy and shuts, and all they see is that Mommy comes before shuts. What about adults? MT quite readily accepts that adults do have NPs, VPs and all the rest. So how does the child go from Mommy to NP? Here's what the answer looks like. And here's an excellent debate between Stephen Crain and Mike Tomasello at BUCLD 2004. Judge for yourselves.