Looking at the world of media: from music to RIA.

Quick update…

October 25th, 2007 Posted in Fake Science, General Media / Stuff, max 2007 | No Comments »

Things have been crazy hectic for me over the last few weeks. The big news is that we have decided to close down the Fake Science store. For more details check out my post on the FS Blog about why we are closing the store. I am hoping that once this all settles down I will have more free time to do things like post and work on some examples. In fact, I am working on an article with some people right now that I can’t wait to talk about, but for now I can only tease you about it.

I still have a lot of notes from MAX that I want to post; the one topic I really want to dig into is the Flash Player internals presentation that I attended. Lots of great stuff, and I want to sit down and write it up. I may try to later today.

I have heard a rumor that sessions from MAX were recorded and may become available. I hope they do, especially the Inspire session presented by Michael Gough. Michael co-heads the Adobe XD design team, and he did an amazing session on design challenges and goals for current and future web applications. Brilliant stuff, and I want to spend more time thinking and talking about what he hit on.

Adobe Max: Adobe Hosted Services – Web APIs and Mashups

October 2nd, 2007 Posted in Adobe APIs, max 2007 | No Comments »

Adobe is now providing a document sharing tool, repository and public APIs, launched today. They are calling the service Adobe Share and you can try it out at share.adobe.com.

The Adobe Share application is a Flex-based tool for managing your repository. You can add files, remove files, update files, manage user permissions and handle the other basic functionality you would need.

Along with the new repository, Adobe is also releasing a public API to allow developers to build custom tools around it. The API provides hooks into the Share system to manage users, file uploads, file tracking, usage tracking, etc.

The API is REST-based, using the kind of token system that most services have been adopting. Developers request a token for their repository and then use it in the connection to identify the account. The key not only pairs you to the repository, it also prevents abuse and allows Adobe to throttle the service.

Currently accounts are throttled to 10,000 calls an hour, and clients must wait 500ms between calls. Clients are tracked by session, so the 500ms delay is enforced per client rather than across the whole account; the 10k limit, however, is set at the account level.
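Since I want to poke at the API myself, here is a quick AS3 sketch of how I would space calls out on the client side. This is my own scratch code, not anything Adobe showed; it just queues up request functions and lets a Timer release one every 500ms so a chatty client stays inside the limit.

```actionscript
package
{
    import flash.events.TimerEvent;
    import flash.utils.Timer;

    // My own scratch sketch, not Adobe code: queues request functions and
    // releases one per Timer tick so calls are spaced at least 500ms apart.
    public class ThrottledQueue
    {
        private var _pending:Array = [];
        private var _timer:Timer;

        public function ThrottledQueue(delayMs:int = 500)
        {
            _timer = new Timer(delayMs);
            _timer.addEventListener(TimerEvent.TIMER, onTick);
        }

        // Queue a function that fires one API request (HTTPService.send, URLLoader.load, etc.).
        public function enqueue(call:Function):void
        {
            _pending.push(call);
            if (!_timer.running)
            {
                _timer.start();
            }
        }

        private function onTick(event:TimerEvent):void
        {
            if (_pending.length > 0)
            {
                var call:Function = _pending.shift() as Function;
                call();
            }
            else
            {
                _timer.stop();
            }
        }
    }
}
```

Note that the first call waits one full tick before firing; good enough for a sketch, and it keeps the spacing logic dead simple.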

One interesting feature hinted at during the session is the ability to have an MD5 hash as a unique key for your application. This key lets you verify that the application you developed is the only one that can access your repository, which helps provide a much more secure application. There are currently two SDK libraries available for Adobe Share: Java and Flex. They didn’t mention if the SDK for Flex is truly Flex or if it is plain AS3. My impression is that it is Flex and leverages HTTPService, but I cannot confirm this.
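For the curious, here is roughly what I expect a call to look like from a Flex script block, assuming the HTTPService impression is right. The endpoint URL and parameter names below are placeholders I made up; the real paths, parameters and response format will come from the Share documentation, not from me.

```actionscript
// Hypothetical example only: the URL and the "token" parameter name are
// invented placeholders, not the actual Share API.
import mx.rpc.events.FaultEvent;
import mx.rpc.events.ResultEvent;
import mx.rpc.http.HTTPService;

private const API_TOKEN:String = "your-account-token"; // the key issued for your repository

private function listFiles():void
{
    var service:HTTPService = new HTTPService();
    service.url = "https://share.example.com/api/files"; // placeholder endpoint
    service.method = "GET";
    service.resultFormat = "e4x";
    service.addEventListener(ResultEvent.RESULT, onResult);
    service.addEventListener(FaultEvent.FAULT, onFault);

    // Pass the account token with the request so Adobe can identify (and throttle) the account.
    service.send({ token: API_TOKEN });
}

private function onResult(event:ResultEvent):void
{
    trace("Repository listing: " + event.result);
}

private function onFault(event:FaultEvent):void
{
    trace("Share call failed: " + event.fault.faultString);
}
```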

Currently, media files (MP3, FLV, MOV) are not supported, but DOC, PDF and ZIP are. One feature they are looking to add is the ability to do document conversion, so that a PDF could be converted to a SWF, or potentially a Word document to a PDF, etc.

Read the rest of this entry »

Adobe Max: General Session and notes…

October 2nd, 2007 Posted in Flash Player, Rich Internet Applications, max 2007 | No Comments »

(Posted on day 2 due to lack of connection.)

I am at Adobe MAX 2007, being held in Chicago, and it was kicked off with the General Session hosted by Kevin Lynch. As with previous MAX conferences, the general session is part sneak peek and part highlights of the new technologies/applications that Adobe and their customers are creating.

I am not going to focus too much on the general session; instead I will post notes about the sessions that I am attending. I do want to mention a few things I learned at the general session, but beyond that I will be posting information I glean from the individual sessions…

One of the cooler things I saw at the general session was a peek at the new Flash Player version 10, aka “Astro”. Two features they are focusing on are text management/manipulation and effects. The new text features offer much more granular control over text objects and full support for right-to-left languages. This is a huge update, because one of the hardest things to work with in the Flash Player is textual content. On some of the projects I have worked on, the solution was often to not let Flash handle the text at all and hand control back to the browser for displaying complex layouts.

The other great thing about Astro is that they are rolling out a new language for writing custom filters and effects that can be applied to Flash content. These filters are similar to the existing ones such as blur, drop shadow, etc. The language is the same one that drives After Effects effects, and it should allow for much more high-performance effects that are small and lightweight.

Internet stability has been spotty, so I will try to post as I get steady connections. There may be multiple posts at a time, depending on when and where my connection actually talks to the network.

I’ve been screwed by DRM

September 24th, 2007 Posted in Digital Rights Management, General Media / Stuff | No Comments »

Over the last three and a half years I have been commenting on how bad Digital Rights Management (DRM) is. I have ranted on the Fake Science Lab Report, kvetched on the Fake Science Blog, and finally I have been burnt first hand by DRM. In fact, I was screwed by DRM twice in one week. Twice!!!

First off, let’s start with the most obvious DRM burn: iTunes video. A few months back I was going on a longish flight with my girlfriend back to Dallas to see the family, and I decided to fill up my new video iPod with a few seasons of South Park. Now, I knew the video was DRM’d, but I felt like being a good little citizen and actually paying for the content since I didn’t own it on DVD. I grabbed a few seasons (not cheap) and downloaded them. They worked great, birds sang, whoopdeefuckingdoo.

Last week I was home and my TV was in the shop (this tidbit is important because it plays into part II). Wanting to watch something while I ate dinner, I grabbed my iPod and hooked it up to my LCD monitor via my MacBook Pro. iTunes fired up, I tried to play an episode, and what do you know, it wanted me to authenticate the machine, since I didn’t buy and download the videos from my laptop. Fine… I knew this was part of the process. I chose my AOL account and tried to log in. EHHHHNT… oh I am sorry, you hit a whammy. It appeared that my password for the AOL account was invalid, which is fucking impossible since I used that password to buy my content and to log into AIM all the time. But no, it didn’t want to let me in, nor was there a way to recover my password.

Fine, be that way, I will just create an Apple account and transfer my account over to it. EHHHHHNT… whammy number two. Seems that the credit card I used on my AOL account had expired and my new card doesn’t match the data on record. Shit. Now what? Okay… customer service. I had to actually call AOL to figure it out. I waited ten minutes on the phone and navigated through a bajillion options before I finally freaked out and said enough with this fucking ass-backward piece of shit experience. I am actually a valid, honorable customer who had no intention of piracy; I paid good money to use this content and now I am locked out!??! This is ridiculous. Meanwhile, everyone else can just go bittorrent the show and bam, there you have it. And you wonder why people hate DRM and run to Pirate Cove?

Round 2. As I mentioned, my TV was in the shop. Why? Because DRM strikes again. It appears that the Samsung 46″ LCD HDTV I own has a little HDMI issue: some devices cannot handshake properly with the mainboard (two hours of testing cables and researching on the web led me to this conclusion). See, the way HDMI and HDCP work is that both devices have to verify that they are legal partners before they can actually share data. The idea is to prevent people from ripping the content from the pure digital stream. Because of an improper mainboard design, my TV could not talk to my new Oppo DVD player. Luckily, Samsung has awesome support and they sent out a TV repair team to review it and then take my TV to the shop, all covered under warranty. From what I understand, the issue with the mainboard is due to the lack of robust standards for HDMI/HDCP before HDMI 1.3 (I could be wrong on this; I would love someone to fill in the details). All because of DRM, and you know the worst part… people are ripping the digital stream no problem.

This is what gets me so pissed off. Billions of dollars are spent on DRM to prevent theft, yet the “thieves” get what they want and the people who actually play by the rules end up getting fucked over. Right now is a really, really bad time to be dealing with new technology. Will my TV work with this video player? Will my receiver handle the new format for HDMI when it gets updated? What happens when they change the standard and I want to upgrade to a new HD-DVD player? Will my TV even work with it? And I do a lot of research and actually know a bit about what is going on. Can you imagine the pain an average consumer has to deal with?

In the past I could buy a TV and it would work with any new device I added to my entertainment center for at least ten years (my Toshiba lasted 12). Today, if I go out and get a decent TV it may not even last five years… it’s totally, totally ridiculous what is going on, and all of this to do what? To try and protect crappy media that I don’t even really want anyway. Seriously, spend the money on making quality material so that I feel like my purchases are worth it. Or lower the price to the point where I don’t feel ripped off if I don’t like what I bought; I will buy it because I do want to support the people who have the creativity to make the content. Don’t make me jump through hoops that honestly don’t need to be there, and don’t price it so high that downloading it feels way worth it because the cost is out of my range. Well, at least my TV is back in time for Halo 3…

LocalConnection: Keeping strong types across applications

September 14th, 2007 Posted in ActionScript, Flash Player, Flex Development, Rich Internet Applications | 1 Comment »

I have been working on a project that requires the use of LocalConnections (LC), and one of the things I found out is that by default the LC does not retain the Class type when communicating across the bridge. After doing a little reading, I learned that the LC uses the AMF protocol for communication and serializes the object down to a base object that is then deserialized when received by the listening app. Honestly, I wasn’t that surprised that the Class type was lost, and it was easy enough to ignore for what I was working on. The problem is that this kind of thing often gets under my skin and I end up having to find out more about how it works and how I can make it do what I want.

A while back I was reading Darron Schall’s blog about using the [Transient] metadata tag and how it can be used for serialization and deep object cloning. The thing that kept jabbing my mind is that he quoted the Flex docs saying this works with any AMF system, including LocalConnection. The more I thought about it, the more his post made me think that what I wanted was possible… I could retain Class typing across the LC bridge, but how? As with most problems, I fired up a test project, and what started out as a simple question turned into a Utility class that is way, way more robust than I ever expected to write in a single hack-a-thon.
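The full write-up is after the jump, but for anyone who just wants the short version of the core trick: AMF needs to be told how to map a serialized object back to a class, and flash.net.registerClassAlias() is how you tell it, on both ends of the connection. The Message class below is a made-up stand-in for whatever typed object you are sending; this is just the raw API, not the utility class from this post.

```actionscript
// Bare-bones illustration: keep a strong type across a LocalConnection by
// registering the same class alias in both the sending and receiving SWFs.
// Message is a hypothetical class with a public "text" property.
import flash.net.LocalConnection;
import flash.net.registerClassAlias;

registerClassAlias("com.example.Message", Message);

// Sending side:
var sender:LocalConnection = new LocalConnection();
var outgoing:Message = new Message();
outgoing.text = "hello from the other app";
sender.send("_bridge", "receive", outgoing);

// Receiving side (in the other SWF, after the same registerClassAlias call):
var receiver:LocalConnection = new LocalConnection();
receiver.client = {
    receive: function(incoming:Message):void
    {
        // Without the alias this arrives as a plain Object; with it, the AMF
        // deserializer hands back an actual Message instance.
        trace(incoming is Message, incoming.text);
    }
};
receiver.connect("_bridge");
```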

Read the rest of this entry »

Flexunit Testing Structures

September 9th, 2007 Posted in Flex Development, Rich Internet Applications | No Comments »

I recently started a new Flex project, and one of the goals is to really leverage Flexunit for creating automated unit tests for the core Framework being designed. My intention with using Flexunit (and any unit testing strategy) is twofold. First, the code I am working on at the moment is predominantly utilities and data processors that can’t really be run from the main system yet, because the main system doesn’t fully exist; by creating unit tests I can execute my code and verify that what I built actually does what I intend. Second, I want to create an extensive library of tests for the core Framework so that as the project proceeds, the developers and QA have a basis for testing before checking in code, to make sure nothing was broken.
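To make the “execute my code and verify it” part concrete, here is the shape of a typical test. This is a generic Flexunit-style sketch against the Flex SDK’s mx.utils.StringUtil rather than the actual framework code from the project, but the pattern is the same: test methods named test*, collected into a suite for the runner.

```actionscript
package com.example.tests
{
    import flexunit.framework.TestCase;
    import flexunit.framework.TestSuite;

    import mx.utils.StringUtil;

    // Generic example test case (not the project's real framework tests).
    public class StringUtilTest extends TestCase
    {
        public function StringUtilTest(methodName:String = null)
        {
            super(methodName);
        }

        // Collects the test methods into a suite for the Flexunit runner.
        public static function suite():TestSuite
        {
            var ts:TestSuite = new TestSuite();
            ts.addTest(new StringUtilTest("testTrimStripsWhitespace"));
            ts.addTest(new StringUtilTest("testTrimLeavesInnerSpaces"));
            return ts;
        }

        public function testTrimStripsWhitespace():void
        {
            assertEquals("abc", StringUtil.trim("  abc  "));
        }

        public function testTrimLeavesInnerSpaces():void
        {
            assertEquals("a b", StringUtil.trim(" a b "));
        }
    }
}
```

A small test application feeds suites like this into the Flexunit runner, and keeping that application apart from the release code is the kind of separation the rest of this post is about.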

I am not going to delve into the details of Flexunit; there are plenty of starting points out there. What I want this post to focus on is how I structured the projects to provide a clean testing development process. To some of you this may be a “duh” article, but for me it was like a light going on about how to create a clean separation of release code from test code. Read the rest of this entry »

Going to Adobe MAX 2007

September 2nd, 2007 Posted in Rich Internet Applications | No Comments »

With a little nudging, okay a lot of nudging, from my co-worker Doug, I decided to go to Adobe’s MAX 2007 developer conference this year. This will be my 3rd official MAX and my 5th Adobe/Macromedia convention. The only difference is that this year I am going as an independent consultant and not as a member of the convention’s staff. It will be interesting to be on the attendee side of the world, and of course there is the pain of having to cough up the cash to go (thank god for write-offs).

It’s a pretty last-minute choice since the conference kicks off on September 30th. I need to sit down and decide which presentations I want to check out. I will also kick off some IMs to my friends and ex-coworkers to see who is attending. No matter what, this will be a great adventure, and hopefully I can meet some interesting people doing interesting things in the web app space.

Resurrection of Vivisecting Media…

September 1st, 2007 Posted in Rich Internet Applications | No Comments »

Since the post last year, most of my effort has gone into the Fake Science Blog. One thing that I have been wanting for a long while is to blog about cutting-edge technology and the Rich Internet Application (RIA) space, since this is what I do for my day job. I am planning (okay, more like hoping) to return to regular postings on this blog. It’s all media in the end, so I figure this is the perfect place to keep my own notebook on development processes, findings on web technologies and other assorted goodness.

Well… there ya go. Enjoy boys and girls!

*yawn* oh, hello

December 19th, 2006 Posted in General Media / Stuff | No Comments »

I am back… kinda.  I do want to blog more but between my day job and Fake Science all my free cycles are pretty much sucked up.  For the moment, I want to throw out a few quick links to the few of you that read this thinger.

First off, over at Fake Science we launched our company blog.  The idea behind the FS blog is to run it as a group effort so that content is updated on a regular basis, and to give everyone a clearer picture of what we are doing and where we are going with FS.

The other thing I want to share is a new DJ mix I put together.  I finally got my new MacBook Pro, and I finally have a stable environment again!  Yeah!

The Scent of Burnt Electronics

September 19th, 2006 Posted in General Media / Stuff, Music, djing | No Comments »

The last 7 days have been… interesting.  Lots of good and lots of bad.  Wednesday I went to the False Profit Equity happy hour to play a nice downtempo set.  I had been working on the set for a while, making sure all the tracks flowed and the progression was structured.  I honestly was really excited to drop this new music… but all that changed in the blink of an eye.

When I arrived at the event I was handed a pint and some cheese-less pizza (which is a good thing in my mind, although I do miss dairy sometimes) and I caught up with some friends I have not seen in ages.  After things settled down I set up my laptop rig: an Alienware 12" laptop, a MOTU Traveler firewire interface and an X-Session USB controller.  Everything was running fine until I placed my rig on the decks and grabbed the input RCA cable.  As I plugged the cable in I heard a loud pop, and I just assumed the mixer level was up, but it was not.

No, what happened still boggles my mind.  Somehow, the act of plugging in the audio caused a short in my system and managed to fry both the laptop and the MOTU interface, and I mean fried.  You could smell the burnt chips in the air.  At that point it all came crashing down… I thought I had just lost all my data.  I tried to reboot the machine and it wouldn’t even post.

After about 3 attempts of rebooting, plugging, unplugging, removing the battery, etc., the machine posted and I actually got Windows running.  I was stoked to still have my data; hardware can always be replaced.  Unfortunately, all was not as well as it seemed.  When I got home the machine never booted again.  I called support and they walked me through a few simple things, but as soon as I mentioned the smell of burnt circuits they issued a FedEx tag so I could ship it to them.

In the meantime I bought an external enclosure for the drive (thanks Steve for the advice) and I was able to successfully recover all my data last night.  That is a huge, huge weight off my shoulders.  Interestingly enough, when I opened the panel on the laptop you could see all the fire damage on the mobo where the FireWire jack is located, and yes, the irony of the jack’s name is apparent to me.
