jay ([personal profile] jay) wrote2002-09-25 01:01 am

Back on the road...

I'm about to head off to a meeting with FAA and NASA folks near Washington, DC tomorrow... just one night, then back in order to minimize the time spent away from [livejournal.com profile] patgreene in her currently-recovering state.

Otherwise... talked with Jeff S. about pattern-matching in time-series data... came up on the fly with a new approach. He promised me a footnote in his next paper ;-). A meeting with my boss was amiable, but led to more assignments for next week (a half-day review that I need to organize by next Tuesday). Our car was ticketed by local police for not having current stickers on its license plates (dispute with the motor vehicles dept.). And I spent an hour in a 3rd-grade classroom this afternoon, discussing petrified wood and helping to dissect a cactus.

On the road again

[identity profile] p3aches.livejournal.com 2002-09-25 01:30 am (UTC)(link)
Wow, you travel for this job. Do you at least get frequent flyer miles? Tonight my dance card got a space on it. Cheers T

Re: On the road again

[identity profile] brian1789.livejournal.com 2002-09-27 03:14 am (UTC)(link)
About 15 trips a year, on average... maybe 75K FF miles, plus bonus miles. Since last year I get to keep them all :-).

(concerned look) hope that nothing too-unpleasant happened that opened that space (hug)

[personal profile] geekchick 2002-09-25 07:22 am (UTC)(link)
talked with Jeff S. about pattern-matching in time-series data... came up on the fly with a new approach.

You know how I love it when you talk like that. ;)

[identity profile] daltong.livejournal.com 2002-09-25 11:17 am (UTC)(link)
I'm interested in the pattern-matching approach. Describe?

[identity profile] brian1789.livejournal.com 2002-09-27 03:15 am (UTC)(link)
Looking at time series... if we non-dimensionalize them and then want to search for given patterns, or for similarities that can be used to classify them, how do we actually compare them? My thought was to discretize at different granularities (big slices, medium slices, small slices) and then re-scan at each level, looking for the pattern. Like taking music and trying to find a given three-note sequence: first scan it as a series of whole notes, then redo it as all quarter notes, then sixteenths, etc., looking for the same three-note pattern at each level of detail. This would be used in classifying time-series Doppler-shifted astronomy data... seems reasonably simple to me.
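For the curious, here's roughly what that idea looks like in code. This is just a sketch of the multi-granularity scan, not the actual pipeline... the function names, bin counts, and correlation threshold are all made up for illustration.

```python
# Sketch of the multi-granularity pattern scan described above.
# All names, bin sizes, and thresholds are illustrative, not the real method.
import numpy as np

def nondimensionalize(series):
    """Rescale a series to zero mean and unit variance."""
    series = np.asarray(series, dtype=float)
    return (series - series.mean()) / series.std()

def discretize(series, n_bins):
    """Average the series into n_bins equal-width slices
    ("whole notes", "quarter notes", ... depending on n_bins)."""
    edges = np.linspace(0, len(series), n_bins + 1).astype(int)
    return np.array([series[a:b].mean() for a, b in zip(edges[:-1], edges[1:])])

def find_pattern(series, pattern, n_bins, threshold=0.95):
    """Slide the pattern across the series discretized at this granularity;
    return start indices whose normalized correlation beats the threshold."""
    coarse = discretize(nondimensionalize(series), n_bins)
    pat = nondimensionalize(pattern)
    hits = []
    for i in range(len(coarse) - len(pat) + 1):
        window = coarse[i:i + len(pat)]
        window = (window - window.mean()) / (window.std() + 1e-12)
        score = np.dot(window, pat) / len(pat)   # Pearson correlation
        if score >= threshold:
            hits.append(i)
    return hits

if __name__ == "__main__":
    # Re-scan the same series coarse-to-fine, looking for a rising
    # three-point shape (the "three-note sequence") at each level.
    t = np.linspace(0, 10, 1000)
    series = np.sin(2 * np.pi * t) + 0.1 * np.random.randn(t.size)
    pattern = np.array([-1.0, 0.0, 1.0])
    for n_bins in (16, 64, 256):   # whole notes, quarter notes, sixteenths
        print(n_bins, find_pattern(series, pattern, n_bins))
```

Scanning at 16, 64, and 256 bins is the whole-note/quarter-note/sixteenth re-scan; a real classifier would then aggregate the hits across levels.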