Rob Williams' Blog: Random musings on Java development in the 21st Century...
Thursday March 10, 2016
It would seem only logical that a major advantage of a community is access to criticism. Ok, yeah, that‘s a bad word. Let‘s back up. To perspective. Wait, what‘s the difference between those things? If someone has a different perspective on something that we have done, we immediately think that they:
Never mind what you think of Plato, it's kind of important to note that the Dialogue was the foundation of Western Philosophy. Oh, wait, that's also a weak argument, I keep using these words that we all know are of little value. I got into an exchange about the Date/Time API today in a Kotlin forum. I made the statement that, after all this time, while Joda and now JSR-310 are certainly night and day better than the horrible date/time stuff that was in Java originally, I was not really thrilled with the result. My main reason? Because I still find every time I return to it I have to use the docs to figure out the simplest things. That's an immediate #fail in my book. What happened next was interesting. One person came on and agreed. But then 2 people came on to say that Date/Time is hard and Stephen kept at it a long time, and yada yada. My retort was that I was not criticizing his effort, his intentions, his diligence, just saying that you want to become conversant in an API the way you become conversant in a language. Eventually, I stopped talking. As I was ruminating afterwards about the pointlessness of saying anything about anything, I couldn't help but notice a bunch of ironies.
Why? This is the open source world's primary taboo: thou shalt not criticize the donator of time.
One of the super sick ironies of this is that in 20 years of being a supposedly community managed language, Date/Time is one of the only cases where Java admitted failure and allowed a replacement. Of course, it took pretty much the full 20 years to get it, and most people are still not using 8 (how to remove it from the Mac is still my most popular question on Stack Overflow). What is the difference between this kind of community adjudication and say X.500? judging now from 2 decades out, considering results: not much. Quickest way Swift could shoot itself in the head and guarantee failure? Adopt this approach. Doesn‘t that sound like I am saying they should not have an open means of criticism? No. Because the communities that think they have that almost never do. Swift has shown it will respond to criticism. It is incorporating feedback, as exactly that: feedback. Someone who can listen to critics, understand what they are saying and respond has become a unicorn. The counter life of the open source saint is the long suffering often lone millstone laborer who dragged his cross up the hill. The saddest part of the story to me is the time vector. Anyone who doesn‘t think it would be rather trivial to come up with a better Date/Time API in 1/100th of that time given literally a handful of resources and perhaps a pool of people to try their product and provide feedback, is living in a fantasy land. Any consideration of open source opprobrium in a matter that falls under the Java rubric I am going to have to just call complete bloody bullshit on. I pity all the people who made huge contributions only to have their work scooped up and made a support system for a new Software Serfdom. ( Mar 10 2016, 02:19:07 PM PST ) Permalink Comments [0]Envisioning the Post App Store World Whenever the topic comes up of when the retrospective gaze can call its assessments ‘history‘ you generally have arguments that run along the lines of ‘you need some distance to have some perspective.‘ Well data is great that way, sometimes it provides more perspective than 20 years could, and its conclusions are undeniable. With the politicians on the Democrat side trying to say that the issue of our day is income inequality, can we historically step in yet and say the app economy has failed? I am ready. And I have been a pretty active participant. I‘m also ready to say that its failure echoes the issues that are at the core of income inequality, making its purveyors cousins of the rapacious vampires that invented the logic that gave us ‘what‘s the first thing you should do after the bankers drove the world economy off a cliff? ensure their christmas bonuses or they will quit.‘ I didn‘t see The Wolf of Wall Street but I did see the scene where Matthew McConaughey walks the little noob through understanding that the bedrock of their lavish existence is the extraction of fees. In case that's not convincing enough (I could understand that given that it's coming from the same guy who's put his awesome acting talent to waxing poetic about driving a Lincoln), let this guy take a swipe: The Retirement Gamble by FrontlineHis mathematical redux of retirement is ‘you take 100% of the risk and if there are any proceeds you get to keep 30%.‘ The interviewer thinking he‘s complete senile is my favorite part, when in fact, the problem is the interviewer‘s inability to understand 3rd grade math. I know you can see what I am saying. 
Now, if we were to look at the 2 store purveyors, of course we know what their position is: their stores have been a huge success! Apple's presentation of that at their tech conferences makes you want to throw up in your mouth a little. Well, when Sketch left the store, people started to wake up to what was already pretty apparent. The store model is a great success for the store owner. And the owner doesn't care so much if you as the goods provider are not having fun hopping through their nonsense.
Allow me a moment of philosophical historicizing on this topic before moving on. I happened to read Nietzsche‘s article ‘On the Use and Abuse of History for Everyday Living‘ again maybe a year or so ago and it kind of took over my thinking in ways that only Nietzsche can. At the core of it is an argument that there is no present (sorry buddhists) that we are forced to constantly construct our own fake present in every moment, from patchy interpretations of the past, using even patchier projects about the future. But the article goes way beyond this observation. One of the things it lays out is the idea that very few people can live in the present. After trying it and having their knuckles upbraided, most retreat into one or another cocoon. This has made me look around a lot and realize that our modern world is obsessed with encamping in the position I call ‘after the event.‘ My girlfriend and I were talking one day about the health care system and she was saying ‘well it‘s like socialism,‘ and I was saying ‘no, what we have is far worse, at least in socialism the government participates in the event, here, we have a government who just mans a croupier stick downstream and rearranges the chip piles.‘ Sadly, the app stores are shrines to this thinking. That fees are the best thing to build a business on (sure has worked for the banks as people have used fewer and fewer of their ‘services‘). But you really know you are lost when, because the fees are going well, you conclude that the products are going well. That‘s like eating fast food for a month and saying ‘well I still had bowel movements!‘ Uh, right. Will keeping Sketch up-to-date even feel like a ghetto, alt experience? I am guessing not. I am going to say between Apple and Google, I give Apple the greater odds of potentially righting their vampiric colonialization project, mainly because Apple is still also living in and before the event: it makes apps, and serious real ones. While Google still releases lab rats and takes a good 80% of them behind the barn for a bullet in their brains. One imagines that somewhere in the upper echelons of Apple‘s management, someone should read a few Shakespeare plays and perhaps represent the Fool in the next board meeting. Before the bankers/serfs monoculture degrades into the Ancien Regime defending their stash of weapons. ( Jan 22 2016, 06:56:00 AM PST ) Permalink Comments [0]Build: the VM Nightmare No One Can Wake From So we all kind of knew that Java‘s rapid rise to hegemony was mostly predicated on the promise of APIs and implementations for everything you could ever need to do. Of course, to make that possible, we needed to open up our notion of how resources were assembled into built products. Twenty years later, it‘s safe to say the appropriate theme song would be the Bowie/Eno classic from Low: ‘Always Crashing in the Same Car.‘ Ant, Maven, Gradle. I am obsessed with how Gradle sounds like cradle, which is kind of not what you want to invoke, and grateful. Like you are going to be a swaddled baby. Rescued from the din of nasty builds. But alas, it turned into just another flavor of what‘s wrong with all its predecessors. What do you call it when Evolution fails? is it atavism?
It makes me laugh hysterically that Hillary is running as a Progressive. It's like a 30-year veteran of Waterfall beaming as he spits out 'Agile.' It is really super hard to be progressive. So much about the tech world is progressive, but then there are also so many parts of it that are deeply regressive, recessive, or even worse, addled and manic. You can almost feel the anxiety in the tool chain. Apple's tools are not perfect, and any day I am doing a search in the build settings, it's with the full consciousness that I am on a towpath over a very dark abyss. Underneath is a soaked and rotted C substrate that is impenetrable. If the 7 +/- 2 rule were to be applied to build, we would rapidly conclude that even the simplest stupid project requires so many return visits to grok the depths of its configured state that a part of your brain refuses to ever regard it as anything but unknown. That said, I prefer Apple's attempt to just handle build. I have Alcatraz now, which seems pretty promising. (If I had all the time back I spent fiddling with making internal jars available to Maven builds, I could probably build a vacation house out of toothpicks.) Apple should encourage this stuff, and really make modularity a priority, especially now that they have a language that is as or more capable than prior vehicles. A lot of times in tech, the progressive is a movement in the margins that allows itself to skirt the main issues and simply eke out less annoying interfaces to what is basically the same dysfunctional core. Does build eventually collapse like the Structured Programming model did? Or did it already, but we are allowed to just ignore it? Build is a flat wasteland of unidimensional applications of the same techniques. Or maybe build isn't even the site of the failure, it's maybe just one of the rash sites, and the real failure is the utter inability to deliver any modularity, which stretches the scope of our forensic investigation back another couple decades. The second episode of Ken Burns' Cancer: The Emperor of All Maladies has an extended segment about radical mastectomy. It was invented at the turn of the 20th C, and it was not until the late 70s, when a journalist was diagnosed, that someone questioned whether there was any data making the case for its efficacy. It took almost another decade to prove it did not outperform lumpectomies. The case for it, 75 years earlier, rested on a single presumption: that cancer clusters around a center (the tumor). Maybe Cedric is treating the rash, but perhaps doing so will also encourage others to take the same instrument(s) and go deeper. I look forward to using his tool. One thing is for sure: the latest Cambrian bloom of build crap in the JS world shows that despite a lot of necessity, there is not a sane build anywhere. Kobalt: Build Management in Kotlin
( Nov 05 2015, 10:15:19 PM PST ) Permalink Comments [0]
I came across the book 'The Life-Changing Magic of Tidying Up' recently. I think I got a sample on the Play Store and right away thought 'wow, this woman has an amazing writing voice.' Now that I have read a bunch of the book I know why: 1. it's a screed, delivered politely, like a witch's curse delivered in a doll's voice, 2. it is DEEPLY philosophical. Nietzsche said he loves only that which was written in blood. Bio-graphy should be a screed, if you lived right. What's so awesome about this is that the author became obsessed with the problems of clutter at the age of 5.
What's so awesome about the book is that whether you are even immediately convinced by each turn she takes or not, her rhetorical force is brilliant. (The etymology of the word graph is from the Greek word for writing.)
The first thought that appeared to me is that there are obviously positional dynamics in coding: a programmer who is doing a greenfield piece that is in an area of interest is thousands of times more likely to produce something that is ordered and has some organizational dimension to it than someone sent down into an abandoned mine to scrape the cruft off of ossified garbage. But also, we've all seen a million times how the village elders position themselves in the antechamber and make their manicured little stubs, then leave it to underlings to actually hook those up to services that pump real blood, in other words: delivering tidy organs by forgoing the actual process of starting the heart. (I think when I finish this I have to go through Frankenstein. Cause, really folks, that's what we are doing most of the time. Animation in software development is always a separate conjuring after the parts have been 'assembled,' usually with a Pareto's Law 80% of them passing through the clipboard.) But then the other thing I started to realize in considering the crossovers was that we live in a fundamental illusion about tidiness that can be summed up quite easily: since we are doing software, and things are still changing, we really can't clean up now, but we certainly mean to, later. (In other words, imagine I am now speaking in that voice from The Wall that's saying 'how do you expect to get any pudding if you don't eat your meat?,' 'how can I clean this all up when I am still using all these toys? I can't put them away yet?')
Orderly Parts Yet Endless Design Variation
Shockingly, I kind of agree with this part. Well, that's not to say that we should allow ourselves to just surrender to mess making. No, rather that we should face it down and make it part of our process to achieve some semblance of order. Ms. Kondo's illusion-free definition is actually a world in which the joyful things are brought forth and then sent back in ways that are defined. Of course, you can imagine how this would map to cooking. For instance, when I have a Caesar (a nearly ubiquitous item that is an abysmal fake 90% of the time), I can tell immediately whether the garlic paste was rendered in a mortar and pestle. Caesar Dressing from a bottle is a complete joke. If I failed a blind test of detecting which was fresh v. bottled in a five-round death match, I would probably accept hemlock after judgment. Anyway, after making Caesars a bunch, of course I learned the optimal way to make the paste, efficiently. Does that mean I have drained the joy from it and turned it into an exercise of lifeless rote tasks? No. But let's remember, the preparation of a proper Caesar is going to consist of rendering a lot of components, especially if you make the croutons fresh, and grate the cheese and spin the romaine, and on and on. Cooking is a strange case because you don't want to robotify it, but you also cannot get anywhere if you just keep accepting Hopalong Cassidy witless piece-by-piece construction devoid of efficiency. I saw a biography recently of Madhur Jaffrey where she waxes on how core spices, even if there are 3 or 4, are like a painter's palette, which can mean the same dish comes out subtly different every time it is made. This is so true. The patterns movement tried to theorize about how we could do this: learn component rendering variations that are internally ordered, and consistent and purposeful, but can, in combination, open up an infinity of possibilities.
If I look inside a pattern, the orderliness of it is kind of not really an issue: a Visitor either is one or it is not. If it is, I know where the two sides of the double dispatch are. That is not creativity crushing militarism. It‘s a realization that things have to be designed to both work internally in clear and reproducible (and consistent) ways, and they have to afford for combination with other elements. Open to extension, closed to changes. Maybe tidiness in code should be part of the ABC (always be closing) credo. If we are always closing, in a sense, we are putting each thing away. The surrender position is not so much a gutless fob to Heraclitus‘ statement that we can never step into the same river twice, it‘s really an inability to work in parts. We all know that codebases often fill up with god objects and weak abstractions. In part because the work is focused on the two ends of the spectrum: the infinite and the infinitesimal. In the middle is this realm where someone has to be able to say: ‘ok, there might be more changes coming to this, but for now, we can embody what is needed functionally, in a valid way, and thus forego accepting mess on a vague promissory note that will either never be paid down, or whose cost to pay off will grow as the moment of inception disappears into the past.‘ There is for sure an element of tidiness in patterns that is really appealing. It‘s pretty common to come across code that in itself is trying very hard to appear pretty and orderly, but that is in fact just presenting polished pieces arrayed in a sky devoid of constellations. The Toyota Case shows us that rot is not some easily explained phenomenon as we were led to believe. Frankly the Rot Generative Theory is kind of a classic scapegoat scenario: the greater men, in their increased wisdom know how to do things, and the lessers don‘t. When these little pygmy also rans sneak in, usually through deception, they seed the code with their hobbled homunculus‘s. We must fight to keep out these vermin! To keep our code clean. Uh, yeah, no.. ( Oct 27 2015, 10:37:38 PM PDT ) Permalink Comments [0]Few weeks ago I mentioned that I thought Java was dead, and was surprised that it elicited any reaction whatsoever. Most people want to take a statement like that and immediately make it personal: you have an axe to grind thus you are electing yourself constable and coroner and issuing a decree. My argument was that it simply fails to satisfy any definition of being alive. And I don‘t necessarily mean alive in the Walter White sense (‘I felt…. alive….‘), but kind of.
Meantime, doing another new project with iOS, it is the standard-bearer at this point in time. Incremental compilation and great testing support make it the best way to do TDD in the history of development. Swift is just stupid. My favorite part of it is that there are not few occasions where you put time into figuring out half-assed stupidities, but really next to none. Some of this is just due to the inclusion of Optionals, and some to the fact that abstractions are mapped more cleanly. There are still some things that I think are needed: unit testing of CloudKit is the most pressing one. Doing Protocol-Oriented Programming has been pretty easy and clean for the most part, though, using a Protocol as a key in a Dictionary, which you would have thought would be trivial, turns out is not, for reasons that are not totally surprising. Apple needs more and better sample apps that demonstrate this that‘s for sure. There are, however, stupid amounts of interesting things being written about Swift. (If community activity of non-trivial engagement is a metric of life, again, Swift to Java looks like running bamboo v. the petrified forest.) Finally, having watched the NSOperation session 3x, there are a lot of quiet aspects to the revolution on the Swift side, that I think will creep up on others. Everyone who watched that session I know was excited by it. While I do think it is important, there are vectors of importance about it, probably the most important is it‘s a further integration of concurrency into normal programming, which, surprise, most concurrency priests see as backward mo, but for sure it is not. The other thing it is, though, is a pretty startlingly effective means of dealing with a common source of code bloat through a simple, conceptual model: the notion of a dependency. If your goal is to write code that will do many things concurrently in a way that is testable, and the code is readable, Swift 2 is without peer. The fact that Java has become overrun with priests, spouting on about best practices, and running around slinging their slide decks at conferences should be the best representation of the language‘s necrosis. It‘s an admission of the fact that the thing in itself is never really going to be what you need and want it to be. That‘s an easier position to stake out when you just keep pointing to your size. Once other organisms appear that are clearly growing and fixing things you never had the resolve to address, it starts to look like Subversion (being subverted) all over again. ( Oct 06 2015, 07:57:55 AM PDT ) Permalink Comments [11]Little More Rumination on WWDC The POTUS was in the hood today. My girlfriend and I made an innocent trip to Trader Joe‘s and Von‘s and upon departure from the latter, the street we came in on was closed. In my attempt to get around, we were stopped on the ramp right when the motorcade was coming through. Few hours later, I decided I would try another episode of Marc Maron‘s podcast. As I was going through my app, I saw that Farhad and his buddy concluded that there were no surprises at WWDC. Reminded me of the scene at the end of Herzog‘s Kaspar Hauser film where he tries to explain perspective‘s dependence upon experience. Pundits seriously make carpet salesmen look like particle physicists. My head exploded when I read that in part, because I had spent a LOT of time digesting the massive carcass and still felt like I was nowhere near done. Please, just STFU if your analysis consists of licking a finger and sticking it in the wind.
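A quick aside, to hang something concrete on the NSOperation dependency point from the previous post, because the idea really is that simple. This is just a minimal sketch of my own, with made-up operation names and the newer Foundation spellings (OperationQueue rather than NSOperationQueue); treat it as an illustration, not anything lifted from the session:

    import Foundation

    // Three hypothetical stages of work; the names are invented for illustration.
    let fetch = BlockOperation { print("fetch records") }
    let parse = BlockOperation { print("parse into models") }
    let save  = BlockOperation { print("save locally") }

    // The dependency is the whole conceptual model: parse cannot start until
    // fetch finishes, and save cannot start until parse finishes.
    parse.addDependency(fetch)
    save.addDependency(parse)

    // The queue handles the scheduling; no manual waiting or flag juggling.
    let queue = OperationQueue()
    queue.addOperations([fetch, parse, save], waitUntilFinished: true)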
Way back in 2009 or something Amazon was selling for $45 and when people were asking me what stocks I liked that and Apple were my only response. I remember the usual reaction was like ‘Amazon? the online store? didn‘t that pop a long time ago?‘ Well, after their last quarter‘s first unveiling of cloud profits, the stock shot up to $450. This week I had to move one of my sites to S3. The path was so full of witless debris and stubbly-ingrown puss pockets it was ridiculous. Wait, this is the Golden Horde that‘s going to pick everyone up? Really? Meanwhile, I happened to watch the What‘s new in CloudKit session from WWDC, and slowly, a simple idea started to grow: don‘t count apple out. Then at a few points, I started to realize that this seedling was intersecting a couple other rant vectors that have been growing for some time: pricing and greed. I am pretty sure I have ranted about pricing on here before. The most ingenious part of the Apple presentation was the main thrust of it was this: ‘you‘re a developer: you make an app, we give you 10 gigs, you pay nothing until you have users, once you have them, some of what they do comes out of their quota, some out of yours, if you achieve a blend, you could scale to 5M users and never end up paying a cent.‘ Genius. Where the first foray into cloud was naive and bumbling, Apple‘s second round looks really bloody smart. The other CloudKit presentation started with a dude basically saying the difference between a great app and a profoundly broken one is in error handling. This was both great to hear and painful, since in genus one, the errors were often useless and sometimes nonexistent. But it also shows that somebody has digested the wrong way and is back to make sure the path to the right way is clearly illuminated. The other thing that was pretty awesome was the fact that Swift 2 brings greater safety to entity/store mapping. Combine this with the other tooling improvements: better tests and automation, and Apple‘s CloudKit could be the Trojan Horse no one saw coming. The other kind of quiet thunderclap that no one seemed to notice was that you can now access it from web apps. A year ago, I ruled it out as a backend possibility. This year, I tried hard to make an argument for not using it and could not. Which gets me to my next point: ok Toyota apparently can‘t do Lean, but for the rest of us out here, let me make an argument that perhaps will sound facile at first but might grow on you. Amazon is like the Hadoop of the cloud now: a big fat blubbering offering that is so vast that every time you have to do anything with it, you feel like Ralph Kamden facing another Civil Service test. That‘s a great strategy in the buildout phase of a technological epoch. It‘s a disastrous one if and when competitors arrive. Everyone knows that the Amazon Free Tier, which is the dealer/junky salvo of yore, is useless. You can‘t put anything worthwhile on a micro instance. You sure as hell can‘t do any analysis. Meanwhile, consider the Lean argument: you are making an app. You start with iOS. Reaching the next shelf in the cupboard, to get your data into a shared realm is seamless. I thought Google was going to get to this first. I tried their embedded Cloud functionality in Android Studio over and over again. I finally did get it to work. But when you look at what you have to do to use App Engine, seriously, it looks like it‘s a few nostril hairs past where WebObjects was when NeXT first made it 2 decades ago. 
Simple bean mapping built on following conventions and not having much control over the rest. Consider also, it‘s still using pre-8 Java. So no lambdas, the old time/date junk, no Optionals. The takeaway from this side is the unknowing money guys are all thinking Amazon‘s the big bet for the cloud because they already have the Customers. Um, Customers can move providers pretty easily. We are for sure living in the age of the hermit crab in tech. When you start considering virtualization at the container level, it‘s laughable to think that motel occupancy is justification for a P/E of 2000 (or whatever Amazon is now). Ok it‘s only 1100. On our way to lunch the other day, somehow the subject turned to Git. I was making the point that though it swept through tech at a rate that made even the worst plagues look like rashes, it didn‘t really advance the main business of source control: conflict arbitration and resolution. It made it much less stressful, and gave you greater power in engaging it, but the state of the art of knowing how to merge files was not change in any way by Git. This is where I think Apple has another huge trump card. As the Hadoop disaster has illustrated, moving things around and doing work on them, is complicated and consumes a lot of the energy. In the Tips and Tricks session, they show an example of a Party with Attendees and they show how complicated it is to handle two mobile users saying they will attend the event. This makes me think Apple has woken up just in time, and they are focused on the right things. Let Amazon have the decaying, necrotic ‘enterprise‘ market (Larry Ellison was apparently gloating about how much Oracle stood to make on the cloud since commissions are only paid from the first of 10 years this week). They are trying to turn the cloud into everything we hate about the Telcos. While they are doing that, Apple can round up the apps of the future. When their opposition wakes up to the fact that they are all running on iCloud it will be too late. And revising their pricing chart won‘t stop them from sinking. Marc Maron on In Bed with Joan Rivers Part way through the WWDC Keynote, I was thinking we were in for another round of not much coming and a bunch of happy talking what amounts to tidying up. But then the bomb bay doors opened and Slim Pickens was barebacking some serious ordnance. The outer surprises were awesome, particularly the open sourcing of Swift, and some of the particulars of WatchKit 2. But things really got rolling when the language and tooling details started to come out. Before I go into why I think those are so huge, there are a few observations about the packaging. The movie they made about apps was spectacular. The one they made about the history of music was peg-legged and witless. Which, turns out, was a perfect capsulization of where we are as a species: we have one of the most beloved science popularizers calling apps a watershed moment in human history in the first half of the show, then in the second, a music presentation that was flat and uninspiring, that seemed to know how empty it was, resorting to stiff name-dropping celebrity schtick. Wait: a company that makes mediocre headphones but could DJ a new radio station no one I know wants is worth $3.2B and I still have people write me to ask what they got when they bought my premium app for $1.99. But I have no hate for Apple right now, as a matter of fact, I should issue a hyperbole warning. 
I think what I saw in WWDC 2015 is engineering I never thought I would live to see.
The most talked-about session on the Swift side was the one about Protocol-oriented Programming. I watched this session really closely. Part of the time I was thinking 'who is this dude who thinks he can just brush aside 60 years of theory accumulated by some of humanity's greatest minds?' But then the other part was like 'yeah, he could have just punted and let us all roll around in languages that make promises that they can't deliver.' Ok, allow me to float a concept that probably will not sit well with most: we have never really had a language in mainstream programming that was born, developed, and seen through to its maturity. Maybe Scala could have been that language but it most assuredly is not. The bible of C states unequivocally that it was developed NOT as a general purpose language, then C++ had a great shot, but ended up trying to convince a bunch of comfortable crusty types and metrics show they failed, then Java, we found out, was both of the prior things in one: it was not meant to be a GP language, and it was, by its makers' admission, pulled out too early and blown up too fast. Frankly the Swift session, consciously or subconsciously, had a subtext of 'we're here to finish the junk that Java and C++ were too gutless to iron out.' The whole session hinged on two things: programming by interface never worked because you don't have a way to reference the object. So they added the ability to have Self in the signature. That, in turn, brings in a problem: what if the object does not support the parts you mean to reference? Solution #2: filtering of the object via a where clause. (I'll sketch these two moves a little further down.) Did the session convince me that these two little changes will forever banish my feelings of longing and incompleteness from years of trying to do interface-based programming in Java? Not sure. I already have done a lot of extensions with Swift and found they did indeed aid in testing. So I am dying to try this stuff out. If you think about it, the whole phrase 'object oriented programming' tells you that your capacity for abstraction is going to be limited because we are already below the layer of behavioral definition and dealing with instances (objects). (A good interview question is to try to engage a candidate in a discussion of the difference between object and class concepts, as it pertains, for instance, to things like state replication [Prototype Pattern].) Hillary made a joke yesterday that she's not the youngest candidate but if she's elected she will be the youngest woman president. I would kind of say the same thing here on the Protocol front: maybe Apple did not neatly sew this up yet and put a bow on it, we shall see, but in a way it doesn't matter because they are the first ones to really try it. The other things that were super dope: UI Testing (OMG, finally), which looks a billion times better than the Selenium bastard imp stuff still crawling around the Java world. While I am calling this a knockout round for Apple, and saying the load this year was, if not the best Apple has ever done, up there with the best, there is one last perspective to consider. Google's development partner JetBrains also makes the process tool our team uses (YouTrack) and they just released a code review tool (Upsource). We use Slack with these 2 and we wrote our own plugin to integrate them. Totally transformed the way we work. Apple should really consider acquiring a tool and embracing process automation.
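Here is the sketch of those two moves I promised. The protocol, method, and type names are my own, chosen purely for illustration, so don't read this as the session's code; it's just the shape of the Self requirement plus the where-clause constraint as I understand it:

    // A protocol that refers to the conforming type via Self.
    protocol Ordered {
        func precedes(_ other: Self) -> Bool
    }

    // The where clause is the "filtering" move: the method only exists on
    // arrays whose elements actually support what we reference.
    extension Array where Element: Ordered {
        func isSorted() -> Bool {
            guard count > 1 else { return true }
            for i in 1..<count where !self[i - 1].precedes(self[i]) {
                return false
            }
            return true
        }
    }

    // A throwaway conformer, purely to show the protocol in use.
    struct Version: Ordered {
        let number: Int
        func precedes(_ other: Version) -> Bool { return number < other.number }
    }

    print([Version(number: 1), Version(number: 2)].isSorted())  // true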
I said this ages ago, but they should have made iMessages Server (included on the mini) into a Slack killer before there was a Slack. (Slack is worth something like $8B now and a team of 10 great programmers could have a competitor ready in a few months; frankly, they just rewrote their Android app as a native app and it's embarrassingly bad.) I have said this before but I really believe both parties have to figure out how to support their application developer communities. The iWatch is a pretty awesome piece of gear but every single review I have seen has concluded that it will live or die by the apps that appear for it in the next 12 months. Sad fact: most people making those apps will not make any money. So they are helping the first original product in 3 years succeed for a company with a $200B cash pile on the hope that consumers will buy their product, while said company doles out billions to try and convince people to listen to their music stream. Really? Offer prizes, invite people to meet with the team, help them promote, do something. Telling us how much money has been paid out is meaningless, verging on cynical. That's not to say I don't think that is an important metric and milestone, I am just saying it's kind of like a mayor touting toll road collections in a reelection bid. Live up to the app movie you made. Apple's engineering and tools teams deserve a HUGE congratulations after this round.
( Jun 14 2015, 02:52:06 PM PDT ) Permalink Comments [0]
Bit more on G v. A / Mobile 2015
Ok so maybe I was a bit grumpy, like the kid who thought he was getting a new race kit and woke up instead to a coloring book. Google Now is not completely useless, and no question, voice assistive tech is better overall on Android. But I stand by the essence of that point: the next major battlefield is context-oriented computing and Google is acting like they think they're sitting on a large lead in the bottom of the 9th. I still see it as a small lead, say, in the 3rd inning: a little rain and the whole thing could be off. Jesus, Soundhound is entering this space? Isn't that like finding out that a sow farmer is preparing an entrant for the Kentucky Derby? Ok, that's a bit of a stretch. On the other side, as much as peace has been restored in Xcode, I had kind of a weird week where, as I was packaging a release with a WatchKit app and a Today Extension, I was thinking 'wait, modular, TDD nirvana is just a shot away, but having a framework include a framework doesn't work? wait, um, really??'
Speaking of which, this article about Toyota's codebase dropped like a thermonuclear device. I have to do a separate, extended rant about it. Actually, thinking about it, I realized one of the main reasons I would like to do a podcast is that the environment is so target rich it's ludicrous. The main redux I offer on this is welcome to a world in which no engineer ever has been paid as much in a year as a final-season Friends actor made for one episode. Welcome to the world where your car and plane might crash, but that's less important than another 800 hours of the Kardashians being recorded for posterity. Remember: Lean, the most popular Agile variant, is derived from the Toyota System. But whenever something bad turns up, the Agile priesthood uncorks the classic bad apple speech from Strangelove. In an industry whose preferred path to known-hood is homily delivery, this is a huge black eye. But here is the key puzzle to consider: how does one of the most successful companies in the world get found out to have a case of rot that makes the House of Usher look like Buckingham Palace? I predict this will do them immense damage as a brand. This is a golden age for development. Both Android and iOS are moving at a pace in which a year compares favorably with decades of prior tech hegemons. Looking forward to iOS 9, but wow, Soundhound and their jacked-up mule joining the derby is frankly awesome.
( Jun 02 2015, 07:33:50 PM PDT ) Permalink Comments [0]
Round One in the Developer War 2015
My scorecard had the mojo solidly on the Android side after last year's I/O: 2 solid releases, Android had actually become more stable than iOS, and finally the design had also leapfrogged: 7 and 8 (iOS) were pretty forgettable and L was a monster. Guess I was kind of surprised at what a flat, horribly-produced affair the 2015 I/O was. Seriously, whoever produced that show should have been cashiered before it was even over. It made the Samsung events look like the Carson-era Tonight Show (actually the S6 rollout was pretty good). Anyway, because I want to be helpful, let me start by providing a couple notes:
Huge fail. Oh, on the tools side, the big news was they sped up Gradle, and the NDK guys finally yelled enough that Google went back to their tool partner and got C++ support. The image that emerged from the presentation in general:
Now that it does work, it's pretty much a monster. Google made a big mistake not going all in on Kotlin. Having done Java back to the year of its birth, I am not a hater at all but it's no match for Swift. I was thinking about something funny about Apple this past week: in some ways, Steve Jobs was the embodiment of the Mike character in the Breaking Bad (and now Better Call Saul) shows: no half measures. So maybe this horrible experience showed that though there was pain, the result was worth it. Though, it's hard to call it because in a lot of ways Swift was a half measure: it lets you continue to do O-C if you like, and mix them. But unlike C++, it does NOT allow you to just write O-C with the compiler (which to me was C++'s ultimate downfall: metrics studies showed that a decade on, only 5% of the C++ codebases even had C++ language constructs in them). I should do a whole post on why I think that once you have used Swift or Kotlin, returning to Java or Ruby or Python will feel like going back to a diaper. (There's a small example of what I mean at the end of this post.) Of course, let me add the perfunctory IMHO here, but I am predicting that the language landscape is changing and that the vanguard languages will end up ushering in a new era which will make the older ones seem obsolete. Sure, maybe Java 10 can try to turn itself into something like Kotlin or Swift, but what's the point? Bottom line on this front: advantage Apple. While they were mopping up the SK mess, they also put in incremental build. Sadly, it's not working perfectly on test targets, but that is going to get fixed. I went nuts and finally removed all my source files from the test target, which with Swift is actually quite easy, but the results were pretty stunning. Now working in TDD fashion is pretty much the most painless environment around: rerunning tests after minor changes to the source is stupid fast. I think until you have a situation like that, you will always have people complaining about tests. Once you have it, you will not want to do any serious work outside a test. My prediction is that Google has blown its chance to really make a dent in the high-end market with this wimpy release, perhaps for another 5 years. I listened to a podcast yesterday where Farhad Manjoo and some other cat were talking about Android and Farhad was making the point that no one is making any money in the Android camp. The other dude, seriously, I was starting to think, was 5 years old (or that's when his brain stopped developing). His main argument was 'Android is a success because it's on the most devices, and because the goal was to prevent Microsoft and Apple from controlling the future.' Really?? Actually I thought even Farhad's argument was, while at least coherent and adult, well short of what it should have been. Just claiming victory based on body bag counts is the ultimate stupidity. I have another main vector in my argument about why I think Google has screwed itself, and that is that Google Now is just not that useful. Sure it can perform some interesting tricks, but my main phone has been Android now for a year and its day-to-day, minute-to-minute impact on my life is nearly zero. This morning it told me how long it would take for me to get to Whole Foods. I was there yesterday, IDIOT!!! If you are going to act like I don't need any help because you have everything already figured out, please try not to interrupt me with your stupidity!!
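And here is that small example I promised, for the Swift-over-Java point. Nothing deep, just the day-to-day texture of Optionals; the names and the little directory are invented purely for illustration:

    // A lookup that may legitimately have no answer comes back as an Optional.
    let directory = ["rob": "rw"]

    func greeting(for user: String) -> String {
        // guard makes the no-value path explicit and keeps the happy path unindented.
        guard let nick = directory[user] else {
            return "Hello, \(user)"
        }
        return "Hello, \(nick)"
    }

    print(greeting(for: "rob"))    // Hello, rw
    print(greeting(for: "alice"))  // Hello, alice

    // Optional chaining plus nil-coalescing replaces the null-check ladders
    // you end up writing by hand in pre-8 Java.
    let badge = directory["rob"]?.uppercased() ?? "N/A"
    print(badge)  // RW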
( May 31 2015, 09:11:14 AM PDT ) Permalink Comments [0]A Few Thoughts on Reactive Programming The word fashion is usually used, outside its own world, derogatorily. Hipsters have brought contempt for fashion to a new level. If you think about it, Hipsters, who, while claiming to hold fashion in contempt, have aroused the ire of people for whom the attack is not a defense of fashion, or mediocrity or popularity (the things Hipsters supposedly abhor), but rather a generalized aggression against tastemaking of any sort, but especially tastemaking whose only drive is to avoid being common. In that I find myself siding with the antihipsters, because the whole process of people banding together to adjudicate taste is absurd, and while I am happy to cotton absurdity, and illogic, alogic is a form of mental lice. At any given time, in any profession, there is always some percentage of the participants who are prone to these types of organizational hysteria. Sadly, in programming, this arc seems to be ascending. Every year we hear about some other tribe who thinks that they‘ve found a burbling spring that will spit out bugfree code and all the craftsmen who trade in their tools for the new creed‘s will skip and whistle their way through work each day. Rails was the kookiest and thankfully that seems to have run its course. But now, alas, reactive is turning into the latest thing. There are a lot of great ideas behind Reactive Programming and if it could be kept pure, it might even be a good hitching post, but it‘s looking instead like it‘s going to be running down the rabbit hole even faster than many of its forebears. Why? Well, as was the case with Spring more than a decade ago, it contains a lot of good ideas, but once logos becomes praxis, those ideas start to serve a new master. And sadly in the case of reactive, those masters appear to have a very narrow point of view. Is concurrency important? of course it is. Will it be important when/if you have to write a mobile app? probably minimally so. Even if the mobile app is sharing data with other users. Then synchronization will be more important, and there are so few frameworks that have done a good job of making that easy. Apple‘s first attempt with CoreData syncing through iCloud was a huge, unmitigated disaster. My favorite part of the long I/O keynote last year was when the speaker pointed out the developer audience that WhatsApp got all the way to a $19B acquisition without a single backend programmer. (I happened to have seen the Nova episode ‘Decoding Neanderthals‘ this week so excuse me if I am sounding Darwinian.) But frankly, from another perspective, I am just really filing a complaint (something I have been told I am expert at). My complaint, not unlike Jay Z‘s, is that the grease is going to the wrong wheels.
How prepared are we, how much grease has the context wheel gotten? Almost none. I bought my first book on context-oriented programming a decade and a half ago. I started thinking about it again because I was working on a watch app. You have a TINY window to communicate through and what you show there MUST be context-sensitive, dependent, adapted. In WatchKit, you have a Glance view in the project (one) and it's a storyboard (a==one). The presumption is that you will simply show and hide different groups of things as needed for the various contexts that you want to support. So suppose you want a daytime and nighttime view. Yeah, your 'storyboard' is going to be a useless jumble of programming clutter that communicates nothing. Had they even given us the ability to define multiple glances and then add some logic as to which one to show, I would feel like I at least got a diaper. But no. Apple's not alone in this. The fact that Google Now is still not open is such a travesty. If I were to assemble my own anti-dynastic theses, that might make it to #1. At least Apple will let anyone provide a Today extension. Today I was reading a bit about context-oriented programming and found a great scholarly article that talks about using Actors. So I thought 'I wonder if a soul from the Reactive group has ever mentioned COP (Context-Oriented Programming)...' If they have, Google doesn't know about it…
( May 10 2015, 11:43:05 AM PDT ) Permalink Comments [1]
Granted, the first successful mainstream language to have exception handling did a crap job with them, but the fact that two of the newest languages completely eschew exceptions seriously makes you question whether evolution really has it right, or if Jimi Hendrix's fable in 1983 is closer to reality: we evolve to the point of spoilage and then we retreat. Whenever I ran into someone who thought Kernighan and Ritchie was a classic, I would argue that it was an ugly lie. Because it sold you the dream of terse elegance, typified in the single-line copy idiom (the crescendo of their masterpiece) (while (*dest++ = *source++);), but of course that is fake code because it doesn't have all the error handling you will need to survive in production. Once that is put in, it will start looking like the Windows source code that made a generation of programmers who got a look at it think about working at a big-box retailer instead.
Or you could go buy a copy of Steve McConnell‘s book Code Complete and learn how to program defensively. Basically boils down to ‘put in the stuff that occurs to you, but don‘t wear yourself out, you can‘t think of everything, so when crap fails, you can just add new defenses.‘ Yeah think about that. Granted, the other extreme, the Maginot Wall, is equally stupid, but sheesh. To me, one of the great chapters in the best book on Object Oriented programming, Bertrand Meyer‘s Object Oriented Software Construction is the one on exceptions, which does two things:
Meyer basically says look: you are not going to traverse layers properly and even if you did, what is the point of taking garbage all the way down to the core only to have to send the obvious ‘message‘ back to the top? How did exceptions just get dropped? There are a couple projects that bolt them onto the side of Swift. But the published books on Swift illuminate their preferred path and jesus it is ugly. It‘s the once glowing city slurped down into a fiery pit of retro goofiness. Go is not much better. The Go manual makes you think you are being treated to error handling caviar, but its major innovation was that you don‘t assign the results of operations. Uh, ok, thank you. That‘s kind of like being told ‘we aren‘t going to kill you but we are going to bind your feet and feed you a diet of bugs and dirt.‘ But then maybe this is for the better: the actor guys, specifically the Akka team, make a pretty compelling argument that error propagation in source code is never really going to work very well. There is syntactic taint, ugly little things you can hold your nose through, and then there are structural abominations that are so pervasively unsettling that it makes you wonder how you would go on. Having written a lot of Swift in 2014 at first I thought I didn‘t miss exceptions much. But now that I am adapting some of that code, I am changing my mind. I was going to write a post about objects in general and a short exchange on Twitter with the great Jim Coplien this week, advancing the argument that most theories of change, from Hegel to Kuhn, were too simplistic, but that surely objects showed that in technology the new way, once it has finally been assimilated, becomes the old way with curtains. That‘s pretty much what the Go passage shows in spades: ‘ok, remember this horrid vestige from your forgotten criminal past? yeah you are back here, eating it again, but we doused it with some powdered sugar for ya.‘ ( Jan 25 2015, 09:45:55 PM PST ) Permalink Comments [3]Ladies and Gentleman: the Language Grafters In the spirit of Serial, the remarkable podcast, I am going to start pursuing threads through multiple posts here. There were a lot of unexploded bombs left in the rubble of my Java 8 post. I want to pickup one of them in this post: what happens when languages are glued together with either prior incarnations of themselves (Java 8/7/5/4) or are bolted onto existing ones (Swift/Objective C). Just so happens, I did a bunch of Java 8 programming, and a bunch of Swift in the last few months. Two things the prior post failed to suss out adequately: 1. why in Java‘s case, the grafting is particularly pernicious, and 2. why in the case of Swift, I would return a different verdict. One of the things that I grew up thinking made me a logical sort and every year makes me feel more like a nut is I like to consider outcomes when evaluating how well something works.
The Producer/Consumer relationship in coding is a source of endless wonder. I hold to the belief that a framework fails if it allows itself to be used improperly. Sounds like a kind of stupid commandment that is about control. It‘s not. It‘s about making sure that the path is clear so you don‘t just end up with a wilderness in which a million wanderers lay down completely unrelated paths. When I first started working with Cocoa, I thought ‘this is kind of a tinkertoy, didSelectRowAtIndexPath, etc. But after a while, you realize that Cocoa is one of the best frameworks ever because once you learn it, you can write fairly clean, uniform code, and you can read the code of others. Languages cannot prove themselves on their own, we should have learned that by now. The C++/Java Watershed was really not about language. C++ spent almost a decade and there were pretty much no APIs/SDKs, other than the STL. Java went to the other extreme and decided to try to paper the whole waterfront. Now, we have competing walled gardens. But they look pretty good efficiency-wise, when compared to the dog‘s dinner/rambling wreck of movements that go from growth to bloat but can‘t help themselves. When you look at Swift vs. Java, the first possible way to see their differences is that Swift is an aggressive example of focusing on offense, while Java 8 is the lumbering wonk who, in the face of hipster criticism, finally agrees to grow a beard. I was pretty happy with the Modern Objective C changes. When Swift landed, I thought ‘what? someone got a whiff of glue instead of aromatherapy.‘ But using it is kind of magic. Another post, but the early takes on it were so stupid, so off-base, so glaringly imbecilic it was hilarious. The most refreshing thing about Swift is its an attempt to kind of utter the last word on language design. Does that sound nutty? I hope so. But really, hipsterism, in Nietzschean terms, is by definition built on self-delusion. We keep having the dream of the new language that makes coding an uninterrupted, skipping and whistling joy, through some new features none of the oldsters thought of. Shouldn‘t language eventually, in Nash terms, tend toward an equilibrium? Meaning, isn‘t there a period in which feature discovery and experiment gives way to a settling of combinations and certain accepted consensus? Yes. There truly are not new ideas, and there is a LOT of consensus. Swift doesn‘t make the mistake of favoring polemic over power. Using it doesn‘t feel like you are having some hipster show you a new way. At some point, as you go back and forth through the (excellent) book, you think ‘this is a kitchen sink, but one that is put together, well.‘ There is some trash talking in the advanced session at WWDC. Some of it is deserved, like when it is pointed out that the generics are real and don‘t use Type Erasure (like Java), but some of it is hyperbolic and stupid, of the ‘ok, what I‘m about to tell you is going to blow your mind, ....pow…...‘ variety. Overall, though, on a good Swift day you are thinking ‘this feels like cheating,‘ and on a great Swift day, you think ‘this is the best language that ever was.‘ (That said, for a really whack twist here, I have flipped and prefer Android now, next time…) The response amongst programmers was gleeful at first, followed by not even skepticism or day after remorse, but just kind of whininess. Some of it I agree with, it can get tiring having to change code that was compiling, to suit some syntactic shift. 
But, most of the time, the improvements feel like adequate recompense. The only thing I miss, and am still not sure about, is the absence of Exception Handling. That‘s a post in itself, but for now, it certainly is not enough to turn the verdict for me. Back to question number one of this post. First off, though Oracle has been selling Java 8 as the biggest batch of new since 5, it‘s really not a new language, and it took too long and came in backpedaling. That said, I like Java 8 a lot. Swift, on the other hand, is poised to do what probably no other language ever has: to transport a population of coders from one language to another. It‘s not without pain, but unlike C++, it also isn‘t going to result in a scene of a few elders declaring victory while trying to hide the fact that only a tiny portion of the population took their ferry, having chosen instead to just stay behind. ( Nov 25 2014, 11:01:18 PM PST ) Permalink Comments [0]Like the rest of the world, I fell deeply in love with Slack. Why? Lot of reasons, but the main one is it provides a new means of really building the SDLC process around communication rather than tools. Which do you think is more common these days, the team of disorganized nitwits meandering around in the wilderness unable to execute despite a brilliant idea and much talent? or the team of hopped up agilists preening with every card event, congratulating themselves endlessly as they hurtle forward on a trip to nowhere? Of course it‘s the latter. I saw some would be philosopher doling out his agile advice. First totemic incantation: focus on the process, not the product. Great advice. What these people are really saying is ‘I don‘t know if what I am doing is ever going to amount to anything, so damn it, I‘m going to feel good about myself for having ticked off the tickets that were assigned to me.‘ There‘s a certain fascism to it: this sense that the horrible fear of chaos has spawned a great desire to clean up the world by installing good old fashioned Order.
So I found in the YouTrack API that to find out what had happened, I had to query an RSS URL. Yeah, that's super stupid, and yes, that's yet another post. I thought to myself 'wow, ok, in general, stream processing gets ugly really fast, this should be fun to do with Streams and Lambdas!' Now, I can look back on that shiny-faced noob (my prior self) anticipating battle like the turkey about to get stuffed into the machine in that Sarah Palin video. At first, my code was kind of looking pretty nice, but two things jumped out and caused a certain amount of cognitive dissonance:
Eventually, I ended up reverting my code and crawling back to 7 because, you see, there really is no way to write good code in a language if the stuff you are having to use is not using that language. You get something that looks new and decent, then you have to go stick it onto the rear end of an ass, and the result is a deformed satyr. Furthermore, each time you do a new piece and get a moment of enjoyment, going back taints it. I remember when 5 dropped: pretty instantly, pre-5 code had a smell of death on it that made touching it almost unimaginable. What is the proper reaction to this? Oh, of course, we waited 5 years for this release, but if you thought the whole set of APIs in the JDK would get ported (and arguably, the XML stream processing code is not even core), you are being ridiculous (you being me). Or maybe it's just that, as Grisby says in The Lady from Shanghai, 'it's a bright, guilty world.' In other words, even under the glare of klieg lights, the dusty shadow of sin taints all, so give up the dream of purity. Not sure I've made up my mind, but I do know that hearing 8 promoted as 'it was time for Java to have lambdas because all the cool kids are doing them' rang in my ears, making me think we have reached the Pyrrhic stage of late empire, where everything is just a gesture, a wiggling of the toe to prove that the giant has still beaten death…
( Nov 19 2014, 10:57:33 PM PST ) Permalink Comments [1]
Revisiting Bloch's Static Builder with IntelliJ's New Postfix Completion
Started putting some blog posts on Medium (that no one reads). They have a great editing system there. Pretty thrilled with the new postfix completion in IntelliJ 13.1 EAP. Redid my Builder Live Templates to use it and the results are pretty awesome. See the video of it here. Wow, Textism is so broken, I forgot how much fun it was trying to suss out the right mixture of voodoo to get the page to lay out! Seriously, give me a damned telegraph tapper. (I guess this is an example of the anonymous broken crap Jony Ive was talking about.) In the Medium piece, I talked about the absurdity of programmers making their ease in construction their primary concern when dispatching their duties, especially when so few of them can type. Then I saw a talk off YouTube this morning (beamed to my Chromecast from my Nexus 5) where by the end, I was having fantasies of the presenter having his fingers closed in a drawer like the scene in Kiss Me Deadly. Every bloody single thing that he typed he had to retype. It was utter torture, like watching Uncle Fester try to perform ballet. By the end of that presentation, I wanted to go find a comments section and put: 'I have two words for the presenter: Mavis Beacon.'
( Mar 18 2014, 05:04:58 PM PDT ) Permalink Comments [1]
Open Descent into Balkan Gloom
Watched a couple sessions on Java 8 today. One of them was Brian Goetz's, and it was pretty awesome. It ended with him saying lambdas were something all the other cool kids were already doing and that Java would have them come hell or high water by the end of next year (immediately telling me the session I was listening to was a year old and we were a few days shy of either hell or high water). While I am pretty excited about Java 8, when you dip into it and look at it a bit, part of your brain lights up and says 'OMG, finally…' But another part makes you wonder 'wow, this isn't just late, it's just a tad too impressed with itself, for what it is.' Blocks came to Objective C with what seemed to me at least to be very little fanfare.
Sure there was some hyperbole, but they were part of a surge that included a lot of stuff. Java 8 is being hailed as the biggest release ever and it has Lambdas and Streams. Oh, and the fix for the Date and Time crap. By O-C/Cocoa standards, it would be a disappointing one year release, but it‘s a 4 year release.
That said, when you look at lambdas a bunch, what you start to see are some of the same unsettling features that plagued prior Java innovations: anemic examples repeated ad nauseam, producing the feeling that either the features were designed in an echo chamber ('wow, that new sort w/out the Comparator and type specs is pretty spiffy!') or that no one did the job of making sure what was specced was going to be enough. Actually, the JavaOne sessions were pretty cool and the examples are good, they just tend to leave 2 impressions: 1. syntactic sugar that sounds like 'less typing for you, junior!,' and 2. the new mongo 4-arg construct that lets you map and filter the internal iterator and collect the results. While I think that looks good, it's not like it's that mind-blowing linguistically. Now, that said, the ability to compose chains of actions that are then scheduled without waits is nice for sure, but my final redux on this part: this has nothing on Akka. Ultimately, I think Java 8 will take the wind out of Scala's sails, but will probably strengthen Akka's position. Isn't it funny though how the Open realm runs around screaming about how horrible walled gardens are, and yet finds itself admitting that it took way too long to feature-match and it's still running to just top off its tank after a decade since the last fill-up? I actually think things are looking pretty good in Javaland these days, but open has surely not succeeded in making closed look bad: Apple's performance has dwarfed it and 8 will not change that. Finally, it's kind of pathetic too that despite how long 8 has been in the hopper and how late it is now, I don't see much uptake in the 'open' community. IntelliJ has some decent support for it. But the Typesafe guys don't have a single mention of it, let alone some doc about how Akka will change its Java interface to support it. Why scream about how awful closed stuff is if, on the open side, we're instead going to get little warring tribes that ignore each other and don't really support the supposed backbone of their mission, the idea of a common, agreed-upon base? Today I was thinking that maybe the dependency injection stuff of the last decade was just a massive girdle. What do lambdas and dependency injection have to do with each other? Actually quite a lot. DI is kind of the last stop in the 'I need my shared mutable state right freaking now' phase of schizoid delusion. We ran out of the procedural burning building because the passing of everything needed was finally untenable, but DI is a weird, lazy parallel crackup. Also, where you would think you would die without DI in Objective C, you do fine, usually with delegates instead, and snippets of code stuck into various joints using blocks. When everyone just decamps to have their own little different versions of the same thing, the process of evolving and questioning the thing in itself just kind of stops. Sadly, Java has become largely reactive (this time I mean philosophically). It's the hardest thing in the world to go from reactive back to being creative. It might not have to, but if it does at some point, I'm betting it's doomed.
( Dec 10 2013, 10:18:35 PM PST ) Permalink Comments [1]