ID:274072
 
I'm not in the habit of saving everything I post to the Forum on my hard drive, but it just so happens I came across this one. I think I posted it a day or so before the new forum was updated and erased all the old messages.

It was provoked by a message in which Tom revealed that the great minds of Dantom are trained not as programmers, but as physicists! I've edited the question a little, but it's basically the same.

So, O physicists of Dantom, enlighten me!

=====
I've wondered about this one for a while...

I know very little about physics, but I've read some "pop physics" books. As I understand it, there's a double-slit experiment, in which light behaves as a wave and produces interference lines kind of like the patterns cast by Venetian blinds; and there's a cloud chamber experiment, in which light behaves as a particle and leaves little trails. Light behaves as a particle when you're observing it, and a wave when you're not.

So, here's my proposed experiment. You set up the double-slit experiment, but the light source has to pass through a cloud chamber to get to the two slits.

If observation really influences the behavior of light, then when you cover up the viewing window on the cloud chamber, the light should create interference lines, but when you remove the cover, you should be able to see the particle trails, and thus there would only be two lit areas beyond the slits.

So, is this what would happen?
On 6/21/00 5:34 pm Guy T. wrote:

If observation really influences the behavior of light, then when you cover up the viewing window on the cloud chamber, the light should create interference lines, but when you remove the cover, you should be able to see the particle trails, and thus there would only be two lit areas beyond the slits.

So, is this what would happen?

Good question. I am not a QM expert by any means, but I can give you my take. As I see it, it's not the fact that you are observing something that causes it to change state, but the process of observation itself. In other words, in order to observe something (like which slit a particle travels through), you must hit it with something else, usually an electron or a bunch of photons. It is the reaction of these scattering entities that we use to judge the position of the incident particle. However, when dealing with microscopic quantities, this hitting process itself is significant, because the momentum transferred to the incident particle is not negligible. This is known as the uncertainty principle, and basically means that in order to gain accuracy in the position of something, we must lose accuracy in its momentum.

How does this affect what you see? Well, when the incident particles are not disrupted (through observation), they will behave in tandem like a wave, interfering with themselves to form the pattern on the screen. When they are disrupted, the change in momentum disrupts the interference and no pattern results.

I've sort of fudged over this last stuff (why do they interfere in the first place?), but it's been a while (and I never fully believed QM myself :) You've got me thinking, though, so I'll read up on it.
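The position/momentum trade-off described above is usually written as the Heisenberg uncertainty relation (the standard textbook statement, added here for reference):

```latex
\Delta x \, \Delta p \;\geq\; \frac{\hbar}{2}
```

In words: the product of the uncertainty in a particle's position and the uncertainty in its momentum can never fall below a fixed constant, so pinning down which slit the particle went through (small Δx) necessarily disturbs its momentum (large Δp) enough to wash out the interference pattern.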

So, for your question, I don't think it matters whether we are looking at the chamber or not. Presumably the cloud chamber is making its own "observation" to detect the trail, so there should be no pattern on the screen either way.

How's that take?
In response to Tom H.
How's that take?

Wow, I can't wait to take physics when school starts again...! =)
In response to Tom H.
So, for your question, I don't think it matters whether we are looking at the chamber or not. Presumably the cloud chamber is making its own "observation" to detect the trail, so there should be no pattern on the screen either way.

How's that take?

Sounds plausible to me--of course, you could probably have made up anything and it would sound plausible to me. :)

I wonder, then, how the experiment results would change if you changed the contents of the cloud chamber... for example, if you slowly changed the contents to approach normal air. Assuming you had another double-slit experiment (without a cloud chamber) sitting right on the same table, that experiment would continually show the interference lines; so as the air quality changed--

(Oops, sorry, have to cut this short for now! Power outage... the UPS won't last long.)
In response to Spuzzum
On 6/22/00 12:42 am Spuzzum wrote:
How's that take?

Wow, I can't wait to take physics when school starts again...! =)

- g -

Makes me regret dropping out of high school... I just buy the text books and read them on my own, but I rarely get to discuss what I read with anyone.

I just saw two physics books on one of my book shelves but it's been about ten years since I've opened them. Makes me want to run home and get them :).
In response to Gabriel
On 6/22/00 5:46 pm Gabriel wrote:

Makes me regret dropping out of high school... I just buy the text books and read them on my own, but I rarely get to discuss what I read with anyone.

I just saw two physics books on one of my book shelves but it's been about ten years since I've opened them. Makes me want to run home and get them :).

I wish there weren't such a "nerdy" stigma associated with science, because there are so many fundamentally interesting concepts that everyone could enjoy if they weren't so deterred by the bad associations. A lot of people, including myself, get turned off by the difficult mathematics involved, but conceptually a lot of the ideas can be appreciated with some patience and a bit of fudgery!

This is not too related to physics, but one book I recommend strongly is "Gödel, Escher, Bach" by Doug Hofstadter (I've been trying to get Dan to read it for a while). It is such a fascinating book about the notion that mathematics is fallible, in the sense that there are certain things that simply cannot be represented (computed, proven) through its structure. I think it won a Pulitzer, which is amazing for a techie book.
In response to Guy T.
(Oops, sorry, have to cut this short for now! Power outage... the UPS won't last long.)

Hey, that wouldn't be a bad thing to model into the CATs HQ... power outages; the security system there is so darned tight, it would be a welcome freedom =)
In response to Gabriel
On 6/22/00 5:46 pm Gabriel wrote:
Makes me regret dropping out of high school...

Makes me regret dropping out of high school and college. I always liked science and I would have taken more in high school but it just wasn't done. If you were smart a teacher would suggest you take physics or chemistry. No one told me I should, so I assumed I shouldn't. I was way too trusting of authority.

I'm not a big reader (though I will admit to finishing most of Carl Sagan's stuff)... I learned most of what I know from the TV, which I guess could be good or bad. If PBS can make history and science interesting where my books and teachers failed, so be it.

Speaking of... has anyone been watching that 1900 House special?

Z
In response to Gabriel
Makes me regret dropping out of high school... I just buy the text books and read them on my own, but I rarely get to discuss what I read with anyone.

So... everyone's saying I should stay in school, right? Well, I don't know... just because it's mandatory to get at least a Grade 12 education in Canada doesn't mean I have to... oh, wait, yes it does =)

I'm sort of getting the best of both worlds as it is; I'm learning all that fancy schmancy physics and kinetics stuff while I'm also taking a couple of C++ programming courses (unfortunately, I think they'll probably go over a lot of stuff I already know, and I might get totally bored). So, as far as I can tell, I think I'm doing okay education-wise... I can become a programmer (like I really want to) or I can fall back on a job as a ballistics expert... (though I might have to work at McDonald's to, you know, attain a scientific background ;-)

Yep, I'm not worried; I'll be programming, somewhere down the line; if I'm not working for some big corporation, then I'll be selling stuff out of my basement... all I know is that I'll be programming... because I love programming.

Nerd: n. 1. A stereotypical, often derogatory term for a socially inept person, often heavily involved with computers and programming. 2. Jeremy Gibson. [my picture here]

=)
In response to Tom H.
This is not too related to physics, but one book I recommend strongly is "Gödel, Escher, Bach" by Doug Hofstadter (I've been trying to get Dan to read it for a while). It is such a fascinating book about the notion that mathematics is fallible, in the sense that there are certain things that simply cannot be represented (computed, proven) through its structure. I think it won a Pulitzer, which is amazing for a techie book.

I read it a while back and enjoyed it immensely, though there were a few parts I kind of skimmed. :) The idea of things that are unprovable within a system is interesting from a metaphysical standpoint, too; for example, in our universe, logic alone can't really prove the existence of God, in the same way that an artificially intelligent NPC mob in a BYOND world could never prove that some beer-bellied dude from Ohio actually created his universe. (Sure, I could create /turf/burningBush or something to give him a clue, but rationally speaking it still wouldn't *prove* it to him... although he'd be a fool to risk incurring my wrath! :)

Hofstadter has another book called Metamagical Themas (I think) which has some interesting discussion about why certain typefaces elicit emotional responses. For example, Dom Casual immediately makes me think "cheesy," Caslon Antique makes me think "old and a little eerie," and Palatino makes me think "classy"... but why? Does the effect result from the contexts in which they've been used? I don't believe so--I think, rather, that they have some kind of intrinsic quality that leads graphic designers to choose them in appropriate situations.

The book even gives special attention to my all-time favorite, Helvetica... though I got a little tired of it for a year or so, when it seemed like it was about the only decent font included on the Macs at school and it was used for everything. :)
In response to Spuzzum
Yep, I'm not worried; I'll be programming, somewhere down the line; if I'm not working for some big corporation, then I'll be selling stuff out of my basement... all I know is that I'll be programming... because I love programming.

Nerd: n. 1. A stereotypical, often derogatory term for a socially inept person, often heavily involved with computers and programming. 2. Jeremy Gibson. [my picture here]

You know, as far as I can tell, in the past few years it's become almost cool to be a programmer. Now I'm reluctant to tell people what I do, not because I'm afraid of seeming geeky, but because I don't want to sound boastful. :)
In response to Guy T.
You know, as far as I can tell, in the past few years it's become almost cool to be a programmer. Now I'm reluctant to tell people what I do, not because I'm afraid of seeming geeky, but because I don't want to sound boastful. :)

Though I might sound a little conceited, I feel as though I know a TON more than just about anyone online... obviously, that's a misconception, what with those truly social-outcast warez owners and hackers who have nothing better to do besides stare at source code for hours to make it easier for people to do illegal things, and with those big kahunas like Microsoft (though I think I could teach them a thing or two about bug fixing ;-). Still, I often just sit back and smile at what I've done, and think long and hard about what I want to do in the future... ah, I'm so glad to still be only 15 =)

I mostly program for fun right now. Most people want to program because they're trying to learn how so they can get a job, but all I care about is having fun doing it, and if I get a nice well-paying job out of it, well, that's a bonus then ;-)

Makes me kind of happy to be a "nerd"; not to mention I'm not like the other computer nuts, in that I could safely beat someone up if they started trying to pick on me (not that I would... but I could =)...
In response to Spuzzum
Though I might sound a little conceited, I feel as though I know a TON more than just about anyone online... obviously, that's a misconception, what with those truly social-outcast warez owners and hackers who have nothing better to do besides stare at source code for hours to make it easier for people to do illegal things,


Actually, from what I've read, it sounds like a lot of the warez d00dz and hax0rs aren't all that bright (who'd have guessed it from their spelling?). Often when you see a news article about a big company suffering from malicious intruders, the perpetrators are "script kiddies" who basically just use premade programs and cycle them through automatic strategies until something works.

Of course, there are some pretty clever villains out there too. But I guess it makes sense that we'd rarely hear about the clever ones... :)


I mostly program for fun right now. Most people want to program because they're trying to learn how so they can get a job, but all I care about is having fun doing it, and if I get a nice well-paying job out of it, well, that's a bonus then ;-)


If it weren't for video games, I probably would have gone into English teaching and actually done some good for the world. :)

Business programming can be quite interesting if you have a decent employer who forces... er, encourages you to learn new things. And since DM is object-oriented, it's a perfect way to practice; I believe the two most popular business languages now are C++ and Java, both OO.


Makes me kind of happy to be a "nerd"; not to mention I'm not like the other computer nuts, in that I could safely beat someone up if they started trying to pick on me (not that I would... but I could =)...

You'll have to find me first... NERD! :)
In response to Guy T.
Business programming can be quite interesting if you have a decent employer who forces... er, encourages you to learn new things. And since DM is object-oriented, it's a perfect way to practice; I believe the two most popular business languages now are C++ and Java, both OO.

I was always wondering how the heck people could get anything done WITHOUT an OO language... I mean, come on! If you can't use actual instances, why the heck do you even try?! =)
In response to Spuzzum
On 6/23/00 2:42 pm Spuzzum wrote:
I was always wondering how the heck people could get anything done WITHOUT an OO language... I mean, come on! If you can't use actual instances, why the heck do you even try?! =)

I learned BASIC way back when, and I liked it, I mean I still do. Sometimes I wish I could still program things in that.

I prefer "geek" to "nerd." I wonder if there's a reason.

Z
In response to Zilal
On 6/23/00 10:21 pm Zilal wrote:
On 6/23/00 2:42 pm Spuzzum wrote:
I was always wondering how the heck people could get anything done WITHOUT an OO language... I mean, come on! If you can't use actual instances, why the heck do you even try?! =)

I learned BASIC way back when, and I liked it, I mean I still do. Sometimes I wish I could still program things in that.

Yup. There's no sense in restricting yourself to one programming methodology. I find the OO approach to be quite elegant, but at times cumbersome in its basic premise that everything must be categorized. For instance, I've always felt that the usage of "friend" functions in C++ was designed as a hack to get around some fundamental problems of OO. But there is no sense starting a war over notation. Make code readable and consistent. In the end, the computer doesn't care!

In response to Tom H.
On 6/24/00 12:31 am Tom H. wrote:
Yup. There's no sense in restricting yourself to one programming methodology. I find the OO approach to be quite elegant, but at times cumbersome in its basic premise that everything must be categorized. For instance, I've always felt that the usage of "friend" functions in C++ was designed as a hack to get around some fundamental problems of OO. But there is no sense starting a war over notation. Make code readable and consistent. In the end, the computer doesn't care!


No, the friend function was a hack to get around C++ being badly designed. C++ really has nothing to do with object-oriented languages, and it would be wrong to make any assumptions based on it.

Having used real OO for many years now (I used to work at NeXT and now work at Apple on Cocoa), I have never found a problem that OO couldn't solve elegantly. You just have to learn not to take the philosophy part of it too seriously, as I did originally.

It's not important whether you are treating things in the perfect object-oriented way, it's important that it gets the job done. If you find an approach that gets the job done but doesn't use the perfect class hierarchy, don't worry about it.

It also really helps to avoid subclassing as much as possible and use delegation instead. But that's another post, and only if anyone cares...
In response to Deadron
It also really helps to avoid subclassing as much as possible and use delegation instead. But that's another post, and only if anyone cares...

If you've got the time, I've got the attention. :)