April 1, 2024
Below is the product of around 20 minutes of keyboard tapping. Okay, I haven't written here or in my public notes in a long time. Little, dumb Hamidah was off in the trenches, the peaks, the troughs, the crests, the void of rumination, embarrassment, contentment, the depths of despair, my beautiful-turned-ugly-then-turned-beautiful-in-a-homely-kind-of-way dorm, and now little, dumb Hamidah is back on the internet (or so I assume, if this post actually makes it onto the internet).
I wanted to write something. At least, I think I do, or at least I think I did. Last year, I wrote quite a bit. I had a lot more time on my hands and not enough outlets, but now I have more outlets and fewer words to say. Funny how that works.
I'm not really sure what to write, but as you know, if you clicked on this post, rambling is my thing. So, here are some select topical updates.
In the winter, I went to San Francisco...again; this time around, I liked it quite a bit. But there is some dissonance happening here. If I listened to the whispers of my heart, I would know that I cannot imagine myself living there. I was in New York during the summer. Before that, the only times I'd been were visits to family in Rockaway, which is not that exciting. But wow, a city as an extension of who I am and want to be? What a concept. For me, both of these things are unclear, but NYC is so confusing that we kind of vibe. Should we go on a second date? I'm invested.
I was in D.C. for the EV unconference. It was fun. I met a young philosopher and some other cool people. If you're reading this, then I'm definitely referring to you :) Overall, it was a good time. The city was very beautiful, and the cherry blossoms are gorgeous. I'm excited for the future; people are working on cool things.
So, I watched the podcast with Dwarkesh, Sholto, and Trenton; it was great. My browser was open on one side of my screen, and I learned a lot. I listened selectively because, with too much investment, I would have uprooted my entire, albeit nascent, career trajectory. Sholto's point about how dynamic compute and infinite context could allow for domain specialization probably stuck out the most, and, although it's a stretch, it feels the most relevant to the research I've been doing and will continue doing.
I've been looking at data provenance, with the use case of making models reproducible, to the extent they aren't already. I could go down the rabbit hole of why this is valuable, but it makes me think about ML reproducibility as a concept. It breaks down into these nice abstractions: bitwise-exact, execution-exact, and same-recipe reproducibility. Seltzer and others agree that bitwise-exact is the easiest to discretize, and the other categories are discretized by association. But reproducibility is really a continuum, right(?), so what is the value of measuring it this way? Like, I take a model, I run it with all the fine-grained provenance tools (that I develop with the research team, hehe), and how close do we get? It doesn't quite make sense, because you either get the same floating point or you don't. Some transformations take your original code and change the operations, but you still get the same floating point; did you get bitwise-exact reproducibility? Systems and philosophy. The convergence. I've been looking at this sort of thing since the fall, more so in the winter, and I've learned a lot.
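To make the floating-point point concrete, here's a minimal Python sketch (my own toy example, not any of the provenance tooling I mentioned): summing the same numbers in a different order can give you different bits, which is exactly why bitwise-exact is such a binary, unforgiving criterion.

```python
import random

# Float addition is not associative, so accumulating the *same* numbers
# in a different order can change the final bits, even though
# mathematically nothing changed. Same recipe, different floats.
random.seed(0)
xs = [random.uniform(-1.0, 1.0) for _ in range(100_000)]

forward = sum(xs)             # accumulate left to right
backward = sum(reversed(xs))  # same values, opposite order

print(forward == backward)    # almost certainly False
print(repr(forward))
print(repr(backward))         # differs from forward in the last few bits
```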
I have a lot of ideas, and I've been running with them to see what happens. I've sent so many emails that my future self will squirm at, but I've had a lot of surprising responses and great conversations.
I'm preparing for the summer. Or rather, I'm not whatsoever, or, you know what, I am a little bit. Whatever. I'll be in Vancouver this summer, working on an AWS team! I visited once in the spring, and it was beautiful. I have this vendetta against the West Coast, but I think I'll have a good time. Time to pull out my summer drip. I should move somewhere with warmer weather so I can wear the better half of my closet more often.
Okay, I think that's all.