Articles from August 2007
There's a lot flying around the web about the document format debate. In one corner, we have Sun pushing the Open Document Format (ODF), and in the other we have Microsoft pushing Open XML. If that weren't enough, the de facto standard, the old binary Word DOC format, is nipping at both competitors' heels, with interop demands coming from every direction. Of course, interop is more about tools than file formats, but apparently that line isn't as cleanly drawn as we might like. As you might imagine, this was a top priority for Microsoft as the Open XML format was designed, so migration between formats (at least in Office) is 100%. ODF, on the other hand, is a whole other story; and this sounds like more than just a tooling issue.
Sun is a key driver of the ODF specification and even oversees the committee that approves the spec. Gee, I don't see any conflict of interest there. I'm not the only one. The Open Document Foundation's founding president, Gary Edwards, recently noted that Sun is opposed to adding any features not already implemented in Open Office, the core of Sun's own Star Office. Gee, that doesn't sound controlling. No "standard" should be limited to one application's boundaries when so many others exist and provide a great deal more functionality. Therein lies the key problem with ODF and the main reason Microsoft had to create Open XML: ODF doesn't support any mechanism for extensions. Maybe its creators didn't fully understand the problem they needed to solve, but I'd argue that extensibility should be an absolute must-have for any wide-reaching solution; especially a standard! This is the key differentiator between the two formats. Simple enough to fix, right? Apparently not. Of course, it's looking more like Sun push-back than a technical blockage. The Foundation pushed a plug-in proposal, but without Sun's buy-in, it won't be going anywhere. According to Edwards, "Sun has successfully blocked or otherwise neutralized all efforts to improve ODF interop with Microsoft documents." Are you starting to see the pieces fit together, too?
Perhaps the best option lies within another of Edwards' comments: "What's really needed is a standards process not controlled by big vendors with big applications and big market share appetites." I couldn't agree more. It's obvious Sun is trying to take advantage of its position to wedge Microsoft out of the game. Unfortunately for Sun, this could backfire. With Office's binary formats still the de facto standard, anything less than full interop support may be detrimental to ODF adoption. The real truth is that Sun isn't only slowing adoption with these choices, but also limiting functionality. I see that as the biggest problem with ODF and Sun's stance. The need for a truly independent standards body, however, is much bigger. Until that happens, I'll have to put my support behind the more functional Open XML standard, which seems more focused on what end users truly need -- not just on what one vendor thinks they need, which is where ODF has ended up. Think about it. What would you choose?
For everyone out there who is a shortcut junkie, I've got just what you've been looking for: KeyXL.com. I started a shortcut reference a while back and am now thinking of removing it. KeyXL covers a ton of apps, so I have a good feeling you'll find exactly what you're looking for. Enjoy!
Jim Rapoza mentions something that should probably be in the back of all our minds: the probability of the Web 2.0 bubble bursting. He makes a few well-thought-out observations about previous bubbles. I, for one, am looking forward to this burst. Part of my reasoning is that, at its peak, a bubble becomes more about marketing than the elegance of the technology in question. After a burst, things even out and it's back to business as usual, except we then have a technology we can implement without the overzealous polluting the waters... well, at least not as many of them. Beyond all of this, though, I'm excited about the potential of Web 2.0 getting back to its roots. Too many people think Web 2.0 is about a dynamic, AJAX experience when, in fact, it's about a more consumable, "semantic" experience. The closest we've come to this original dream is the abundance of service orientation around technologies such as RSS and Atom. You'll see this in intriguing mashups such as those created from services like Pipes and Popfly. One of those recent mashups, centered around the Windows Live family of services, is Tafiti, a Silverlight-based search utility.
This burst can't come soon enough, but I'm afraid we probably have a while longer to deal with this bubble. While the burst is inevitable, I don't expect to see a significant change until late 2008. Of course, by that time, we'll be seeing the beginnings of the next big thing. I just wonder what that will be...
I mentioned I might have an opportunity to get Simon Guest and company to work on a user experience prototype for a project I'm working on. He came by last week to talk to a few people and the response was very good, which I was glad to see. Of course, that doesn't mean it'll happen. There are still security issues that need to be worked out, since it's a government customer. This honestly shouldn't be a problem, since neither the data nor the code is what Simon's team is interested in, but such is the life...
In one of the meetings we had, Simon made a comment that I thought was very interesting. One guy was talking about having Simon's team do the user experience prototype on a web part. Simon responded that deciding what to show in a web part is more about user interface than user experience -- downloading and using the latest and greatest controls doesn't necessarily mean you're delivering a great user experience. Thinking about it, this made a lot of sense. Of course, I'm probably bastardizing the whole thing. What I got out of it was that the difference between user experience and user interface lies, in part, within the context of that interface and its presentation to the user. User interface is a subset of user experience, in that a user interface defines what the user can see and/or do. User experience is more about how the user gets the job done; specifically, the ease-of-use aspect of that interaction.
This concept of UI vs. UX strikes right at the heart of the notion that throwing up pretty pictures equates to a good user experience. Not to knock graphic design's importance, but interaction design is a must-have when it comes to improving user experience.
I know there's supposed to be a new Windows Live release coming up pretty soon, but apparently someone felt the need to push out some of those Hotmail changes early. With that release come a few things I'm pretty happy about: the ability to skip the Today screen, the ability to accept/decline Outlook meeting requests, and more storage (now 5GB). I noticed the changes this past weekend, but thought it might've been a fluke of some sort. I guess not. Of course, one of the features I'm really looking forward to -- the ability to switch between accounts -- has yet to be released. I will be a happy man when that one is added. I don't know if the release date has been announced yet, so I'm going to keep it to myself, but let's just say we're not too far off.
I just have to gripe about this. For those of you who haven't been following the Netflix-Blockbuster competition, Blockbuster is losing... well, that's my opinion, anyway. Blockbuster tried to compete and it wasn't too successful, so they had to scale their plans back, which made them cheaper, if I remember correctly. Now, from what I gather, Netflix is trying to get a feel for how it might deal with lowering prices. Of course, they're only doing this for select individuals on their 2- or 3-out plans. This just aggravates me. Why are they rewarding people who use two of their smallest plans? Most companies try to persuade customers to upgrade by showing how they can save money on the larger plans. Netflix isn't (and has never been) that smart. In fact, they've got you at a sweet spot of 3-out for $5.67 per disc ($5.33 each with the price drop), while all plans with 4 or more out run $6 per disc.
Someone at Netflix needs to get a clue. Logically, you should see a small per-disc price drop at every level as you move up. That's what entices users to upgrade. Maybe that's the problem... the logic. Of course, I'm just bitter.
Apparently, Microsoft is now pushing for its XML Paper Specification (XPS) to be standardized by Ecma International. Is it just me, or has Microsoft made Ecma? Don't get me wrong, the association was around long before Microsoft first approached it to standardize the C# language and Common Language Infrastructure (CLI). Heck, Ecma actually dates back to the early '60s, long before Microsoft came on the scene. I guess my point of view is partly due to what I'll call "narrow-sighted Americanism," but that's a whole other topic for another day. Another part is probably because I'm fairly heavily focused on .NET, web development, and the standards that revolve around those areas.
I guess one thing I'm wondering is, how many standards organizations do we really need? Seriously. Off the top of my head, I know of seven that affect the work I do: ANSI, Ecma, IEEE, IETF, ISO, OASIS, and W3C. At the time of this writing, Wikipedia lists 35 international standards organizations as well as a slew of regional and national ones. Do we really need so many? As with most seemingly duplicated software, I'm guessing each of these grew out of its own need for standardization within its area, whether location- or field-based. I just can't help but think we have a bit too much redundancy. I admit, it is sometimes hard to submit to someone else's opinions on matters like standardization, but how good is a "standard" if there are 100 of them? Of course, what good is a standard if it doesn't meet all needs? The truth is, nothing is 100%. We all know this. I'm just thinking it's probably about time we had a standard for standards bodies. I'd like to see standards bodies come to an agreement on who decides what can or can't be a standard. With that, I'd like to see some of the seven organizations we hear about daily go away. I'm not going to say who I think should be merged with whom, but someone should.
While I'm on the topic of "how many is too many," there's the obvious question: how many standards do we really need? Bringing that home to what started this rant, XPS has one primary competitor today: Adobe's Portable Document Format (PDF). We all know about and love to hate PDF... I'd like to stress the "love to hate" part, as PDF is the one document format I'll go out of my way to avoid. Then again, Foxit Software has made it much more bearable with its Foxit Reader and PDF Editor applications. The main benefit PDF has had over other formats is its read-only nature. You can publish a document as a PDF on the web and feel pretty safe about it not being re-published by third parties with customizations you didn't approve of. XPS has that same feel, but is much more open than PDF, which is why I like it. Of course, there's still that glaring question: what's the difference? Why do we need a second read-only document format? That's a good question...
By Michael Flanakin @ 4:11 AM :: LSU
I'm definitely looking forward to a good season. LSU is topped only by USC, but that doesn't mean a whole lot to me. USC seems to have it easy over there. With more than half of the SEC in the top 25 and 75% of the conference in the top 50, there's going to be some heated competition. Then again, when have you known it to be anything but in the SEC? USC's competition isn't looking too lively, as usual, with only 3 of the Pac-10 in the top 25 and 60% of the conference in the top 50. Of course, these preseason numbers can be argued. We'll see as the season progresses. Life in the SEC is pretty much one game away from leading or following in the conference, so just about every game is key. There are bound to be a couple of big upsets, like there are every year.
Late last year, I saw a Channel9 video on a tool being developed by Microsoft Research (MSR), FastDash, which is obviously an acronym for Fostering Awareness for Software Teams Dashboard. Actually, I'm not sure how much the acronym is still being used, since the capitalization seems to have changed in recent usage, but whatever. The bottom line is, if you haven't seen the video, you should go check it out. There's another short spot with it and a tool called DynaVis, also in development at MSR, that you should check out as well. Heck, if you don't frequent Channel9 and you develop with Microsoft tools and technologies, you're missing out; but I digress... After seeing FastDash, I had to have it. Essentially, the tool gives you insight into what your dev team is doing. There's huge potential for what this could become, and I'd like to see it grow into something that's included in the Visual Studio tool set. I have no idea if/how that might happen, but, given my interest in research, I volunteered my time to the team. For the next release of FastDash, we're planning on changing how the tool functions a little. We'll be using another tool called PipeDream to create the UI. Now, this is something I'm still wrapping my head around, so forgive me if it sounds odd. PipeDream uses PowerShell cmdlets and what it calls vislets to generate output. An example of this might be to use a Get-CpuUsage cmdlet piped to a progress bar vislet to see how your CPU is being utilized.
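I have no idea what PipeDream's plumbing actually looks like under the hood, so take this with a grain of salt, but just to make the cmdlet half of that example concrete, here's a rough sketch of what a Get-CpuUsage-style cmdlet could look like. The name matches the example above; the implementation (a simple performance counter read) is entirely my own guess for illustration, not anything from the PipeDream or FastDash teams.

```csharp
using System.Diagnostics;
using System.Management.Automation;
using System.Threading;

// Hypothetical sketch only -- this is NOT PipeDream's or FastDash's actual code.
[Cmdlet(VerbsCommon.Get, "CpuUsage")]
public class GetCpuUsageCommand : Cmdlet
{
    protected override void ProcessRecord()
    {
        // Total "% Processor Time" across all cores. The first sample always
        // reads 0, so take a second reading after a short delay.
        using (PerformanceCounter cpu =
            new PerformanceCounter("Processor", "% Processor Time", "_Total"))
        {
            cpu.NextValue();
            Thread.Sleep(500);

            // Whatever sits downstream -- a progress bar vislet, in PipeDream
            // terms -- just sees a number between 0 and 100 on the pipeline.
            WriteObject(cpu.NextValue());
        }
    }
}
```

From PowerShell, the idea would be to pipe Get-CpuUsage into whatever rendering the vislet side provides.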
So, I'm creating some PowerShell cmdlets to access TFS resources and something funky started to happen... When I call Get-TFWorkItemArea, I get a list of the child areas for the first area. That's strange. After digging into it more, I realized it was actually giving me all of the child areas; I only had children on one area node, which is why I was only seeing the first one's children. Apparently -- and this still doesn't make complete sense to me -- the Cmdlet.WriteObject() method writes out the object's child nodes when I tell it to enumerate a collection. Very strange. There are two ways to call the method: one passes it an object, and the other passes it a collection and tells it to enumerate through that collection. Being new to this, I assumed enumerating through the collection was the best option, so that's what I went for. Of course, I was wrong. I still don't fully understand why this happens. I may dig into it one day, but for now, I'm just glad it's working again. I'd be curious to see how other people are using it.
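For anyone who hits the same thing, the short version as best I understand it: WriteObject(object) sends whatever you hand it down the pipeline as a single object, while WriteObject(object, true) enumerates it and sends each element separately -- and if the object you hand it is itself enumerable (say, an area node that enumerates its children), you get the children instead of the node. Here's a stripped-down sketch of the difference; the AreaNode class is completely made up so the sample stands on its own, not the actual TFS object model.

```csharp
using System.Collections;
using System.Collections.Generic;
using System.Management.Automation;

// Made-up stand-in for a TFS area node; the real object model is different,
// but the important bit is that the node enumerates its children.
public class AreaNode : IEnumerable<AreaNode>
{
    public string Name;
    public List<AreaNode> Children = new List<AreaNode>();

    public IEnumerator<AreaNode> GetEnumerator() { return Children.GetEnumerator(); }
    IEnumerator IEnumerable.GetEnumerator() { return Children.GetEnumerator(); }
}

[Cmdlet(VerbsCommon.Get, "TFWorkItemArea")]
public class GetTFWorkItemAreaCommand : Cmdlet
{
    protected override void ProcessRecord()
    {
        AreaNode root = new AreaNode();
        root.Name = "Project";

        AreaNode child = new AreaNode();
        child.Name = "Project\\Area1";
        root.Children.Add(child);

        // enumerateCollection = true unrolls anything enumerable, so this
        // writes the CHILD nodes to the pipeline instead of the node itself.
        WriteObject(root, true);

        // This writes the node itself as a single pipeline object -- which
        // turned out to be what I actually wanted.
        WriteObject(root);
    }
}
```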
A friend just asked me about the integration story between .NET and ColdFusion the other day. It figures something like this would come out, but apparently Adobe is looking to enhance CF8 with the ability to access .NET objects. Back in the Macromedia days, this level of integration with Java was already added. Seeing the sharp decline of Java in the enterprise, it's no wonder Adobe's making this move. My only question is, how does this relate to the Dynamic Language Runtime (DLR)? I remember hearing something about ColdFusion support when the DLR finally gets released, but I haven't followed it that much. It's probably no surprise, but I'm not a fan of dynamic languages. I'm sure I'll get flamed for this, but they seem like lazy wo/man's languages. Beyond that, there seems to be a greater potential for buggy software. I just don't understand the selling point. To each his/her own, though. That's why .NET is so great -- we can make these language choices on our own, yet still integrate with the rest of the platform.
If you're not familiar with Simon Guest, he's on Microsoft's Architecture Strategy Team. While the name may not mean a whole lot to you, you'll probably recognize the Architecture Journal, of which Simon is the editor. If that's still not ringing a bell, I suggest you at least give it a look-see. A lot of devs find the concepts too abstract, so it's not for everyone. Each issue is heavily themed, so you'll usually know early on how much you'll get out of it. Either way, that's not what this is all about... Simon started pushing user experience for architects about two years ago, if I remember correctly (probably not). I latched onto this because I'm a huge proponent of user experience. Like most developers, I'm no designer, but I think I do have some artistic ability... at least more so than most developers I've met. Of course, it's not all about graphic design; behavioral design (aka human-computer interaction, or HCI) is actually the biggest part of user experience. Admittedly, I have a lot to learn in this arena. I'm finding out a lot of it has to do with merely thinking outside the box and trying to imagine simpler ways to get tasks done, but even that isn't as simple as it sounds.
I was lucky enough to see a presentation on user experience Simon put on at TechReady, Microsoft's internal conference for those of us in the field. His presentation was absolutely awesome. I left psyched about one thing: getting him and his team out to a project I'm working on. We have a number of systems that could seriously use some re-engineering on the UI front. Then again, what system doesn't? I don't know if it's going to work out or not -- there are a lot of factors that come into play. Nonetheless, I'm hopeful. I'd seriously enjoy the opportunity to leech off the process Simon and his team use.
If you have an opportunity to see Simon's user experience presentation, I highly suggest you take advantage of it. He posted the slides from his recent appearance at the San Diego UX Summit, so that's a start; but I have to let you know you won't get the same experience. Simon's presentation is something you need to see in person.
Have you ever used Virtual PC on a machine that just didn't want to let the window resize as expected? I have. As a matter of fact, I've got three laptops I'm on regularly as well as two desktops, and one of them just has to be the bastard of the group. I maximize the window and the area taken up by the VM isn't the entire screen. It's kind of an odd thing to describe. I wish I could take a screenshot of it, but when the window is maximized, the host OS can't be accessed. Well, not that I know of, anyway. So, if my guest OS is set to 640x480, the host resolution stays at its original value, 1680x1050 in my case (on a widescreen monitor), and the rest of the screen outside of the 640x480 is black. I tried playing with the guest screen resolution, but could never get it to work as it has so many times on other machines. As a matter of fact, it only let me select standard resolutions and not widescreen resolutions, which only added to my aggravation. Luckily, someone clued me in on a little secret: Virtual PC only supports a host OS resolution of 1400-something wide; in my case, that's 1440x900. But... why didn't resizing my host OS work!? Bah... stupid VPC... You have to completely shut down Virtual PC and restart it for it to recognize and accept the change. From there on, you should be golden.
Umm... you've gotta be kidding me, right? Why isn't this mentioned anywhere? VPC should tell me about this problem and try to remedy it on its own. How? Easy. Why not display a warning on the VPC Console? Give me a button or setting I can use to automatically set my resolution to 1400-something when I maximize a VM, then switch it back when I'm not maximized. Sure, there'd be a slight delay in maximizing the window, but if I know what's going on, I'm fine with that.
Windows Live Folders has officially been renamed to SkyDrive. With this, the look and feel has been updated along with a few other changes to enhance the user experience. I'm very glad to see this upgrade, but there's much more to go. I'm waiting for further integration into other WL services and my Windows desktop... a la FolderShare, I'm hoping.
If you've been wondering what'll be in the next release of Team Foundation Server, Brian Harry's done a great job of summing up the final feature list for TFS 2008 in a fairly concise list of bullets. Brian's been doing an outstanding job of evangelizing TFS both internally and externally. If you're interested in its future, this is definitely a man to watch!
By Michael Flanakin @ 7:33 AM :: .NET
If you haven't kept up on .NET 3.5 like a good little developer, then you're probably only sparsely familiar with what's coming out this year. Luckily, you can catch up fairly quickly with a bit of history on LINQ, the most talked-about of the new features. You'll also get a good understanding of how LINQ works under the covers, which is somewhat interesting. Of course, not everything is covered in this article, so there's still more you should read. I just thought this was some interesting insight into how some of the features came about. I still question the value of a few of them, but all in all, I think it's turned out well so far. We'll see how things go throughout the year, when Visual Studio "Rosario" gets rolled out with what I'm assuming will be an updated version of .NET and the languages. Most likely, it'll only bring minor performance improvements and other tweaks, but that's yet to be seen.
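If you haven't touched LINQ yet, here's about the simplest sketch I can put together to show what the fuss is about -- nothing from the article, just my own toy example -- illustrating how the new query syntax is really just compiler shorthand for extension methods and lambdas underneath.

```csharp
using System;
using System.Linq;

class LinqSketch
{
    static void Main()
    {
        int[] scores = { 97, 62, 88, 74 };

        // The new query syntax...
        var passing = from s in scores
                      where s >= 70
                      orderby s descending
                      select s;

        // ...is translated by the compiler into the equivalent
        // extension-method calls and lambda expressions.
        var samePassing = scores.Where(s => s >= 70).OrderByDescending(s => s);

        foreach (int s in passing)
            Console.WriteLine(s);       // 97, 88, 74

        foreach (int s in samePassing)
            Console.WriteLine(s);       // same output
    }
}
```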