Monday, April 12, 2010

Build It Better: My Newest Thread of Insane Ramblings

I've always been something of a systemic malcontent; I've never met a way of doing things that I couldn't find some way to improve. It used to be that these were little gripes about the minutiae of how this aspect or that aspect could be marginally improved, but over the past few years I've become a little bit more radical. Now by radical, I don't mean I want to go destroying things. I mean radical in the sense that I'm sick of systems which are blatantly behind the times, even dangerously so. These are systems which had serious problems in the beginning and which haven't really evolved into something better as time has progressed. Some have even evolved into things which are actually worse. It's important here to define what I mean when I say "good," because my definition is heavily slanted:

Sustainable: A system must be designed and built in such a fashion that it can exist and perform without causing environmental damage. When possible, a system should actually contribute to overall environmental health as a byproduct of its existence/operation. Additionally, materials used in the construction/operation of said system should be "Cradle to Cradle" materials. It doesn't take much introspection to see that the way we're doing things now is woefully destructive to the environment, and by direct extension to us. We need to be building systems which are part of the solution, not the problem.

Empowering: When possible/applicable, a system should empower those who use it to become better individuals. We aren't used to thinking about systems this way, but it's about damn time. By not thinking about it, we've created an entire world full of systems which isolate, belittle, frustrate, and generally do nothing more than the bare "utility" we've ascribed to them. By building systems which encourage those who use them to become happier, healthier, and more aware of the world, we will be doing ourselves favors in the long run. No one product or system will ever grant us sainthood, but at the very least we can start building systems which aren't fighting us. I'm reading a book on this very topic right now called Sustainability by Design. It's a chewy read, as the author takes you through a very thorough case for his ideas, but definitely worth it if you like this kind of stuff.

Practical/Pragmatic: Sometimes I see proposals for systems floating around on the Net and I just have to laugh. The ideas are inventive and the intentions are admirable, but the systems as described just couldn't be made to exist without seriously changing the way the population at large is used to using technology. A system must be practical to build, practical to operate, must require only minimal adjustment from those who use it, and must be useful and effective from day one.

Incremental Deployment: Any system which cannot be built/deployed in an incremental fashion is doomed to forever remain a drawing on a designer's desk. New systems, like internet memes, need to start small and spread virally based on their success. This is one of the reasons it's taken the US so damned long to get high-speed rail; nobody wants to pay for a half-built network of rail lines.

Low Upkeep Costs: Systems by their nature require investment over time to keep them effective and up to date; however, a system which gobbles money continuously is a bad system at best.

Flexibility: A system must be able to adapt/be adapted to fit a wide range of operating conditions and do so with minimal additional cost.

There's a whole constellation of other assorted factors that I feel are crucial to a good system, and I'm sure I will remember all of them precisely two minutes after I post this entry. It is worth noting that what I call "systems" covers a wide variety of things, including but not limited to products, infrastructure, and services.

So with all of this in mind, I am starting a series of posts which I'm entitling Build It Better. In these, I will be singling bad systems out, holding them down, and mercilessly beating them to a bloody pulp. I will then proceed to suggest my own brands of systemic villainy, because I'm an egotistical tool who likes hearing himself talk (at least I'm honest about it). Feel free to follow along as I descend further into my caffeine-induced hallucinations of a world filled with systems which don't suck.


Sunday, September 13, 2009

Machinima Viewer: Take 2

At this point I'm fairly sure it's no mystery that the concept of utilizing the Metaverse for machinima fascinates me. While it is still in an awkward stage, I feel that machinima will one day be considered yet another legitimate form of storytelling. The Metaverse holds particular promise as a production platform because, unlike other systems, the creation of unique content is widespread, it is stylistically flexible, and it is unconstrained by conceits of what should be done with the platform. This, coupled with the ability for widespread collaboration and less draconian copyright policies, makes the system ideal for the production of machinima. With that in mind, it is worth highlighting the fact that while the potential for greatness exists, the tools to leverage this potential have yet to be built.

In a previous post, I described a very ambitious viewer project that would have allowed machinimatographers (who I'm going to call 'Toggers from now on for sanity's sake) to capture and edit machinima footage in new and powerful ways. While I still hold that system as an ideal, several months of continued introspection have revealed that the construction of such a system would be a herculean effort, with technical challenges that few are willing to tackle at this point in time. A different approach must be taken as a first step. A 'Togger-friendly viewer is possible with existing code: an existing viewer could be stripped down to the functionality relevant to 'Toggers and then enhanced with new functions, with more advanced functionality added as development continues.

Here's a list of the basic functionality a 'Togger would need for setting up a scene and shooting:
Avatar movement
Camera controls
Teleportation
Basic communication (Chat, IM)
Inventory access
Object placement and positioning

Advanced Functionality:
Fine control over camera behavior (maximum speed, camera roll, zoom control)
Local camera view bookmarking
Bookmark-to-bookmark camera animation
Timeline-based camera animation controls
Aspect ratio window sizing
Shot composition screen overlays
Easier toggling of interface elements
Triggered animation control over actors' avatars (RestrainedLife API?)
Real-time shadows
Built-in footage capture (ditch FRAPS)
Local texture substitution
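
To make one of those items concrete, here's a minimal sketch of how the bookmark-to-bookmark camera animation from the list above might work: save two camera views, then interpolate between them with easing so the move reads like a deliberate dolly shot. This is purely illustrative Python; the CameraPose fields and the apply_pose hook are my own hypothetical names, not any real viewer's API (an actual viewer would implement this in its own C++ camera code).

```python
# A rough sketch of bookmark-to-bookmark camera animation.
# CameraPose and the viewer hook (apply_pose) are hypothetical;
# a real viewer would expose its own camera API.
from dataclasses import dataclass

@dataclass
class CameraPose:
    position: tuple[float, float, float]   # region coordinates
    focus: tuple[float, float, float]      # point the camera looks at
    roll: float = 0.0                      # degrees

def lerp(a: float, b: float, t: float) -> float:
    return a + (b - a) * t

def ease_in_out(t: float) -> float:
    """Smoothstep easing so moves start and end gently."""
    return t * t * (3.0 - 2.0 * t)

def animate(start: CameraPose, end: CameraPose, seconds: float, fps: int = 30):
    """Yield one interpolated pose per frame between two bookmarks."""
    frames = max(1, int(seconds * fps))
    for f in range(frames + 1):
        t = ease_in_out(f / frames)
        yield CameraPose(
            position=tuple(lerp(a, b, t) for a, b in zip(start.position, end.position)),
            focus=tuple(lerp(a, b, t) for a, b in zip(start.focus, end.focus)),
            roll=lerp(start.roll, end.roll, t),
        )

# Usage: feed each pose to the viewer's (hypothetical) camera setter.
shot_a = CameraPose(position=(128.0, 128.0, 25.0), focus=(130.0, 140.0, 24.0))
shot_b = CameraPose(position=(140.0, 128.0, 30.0), focus=(130.0, 140.0, 24.0), roll=5.0)
for pose in animate(shot_a, shot_b, seconds=4.0):
    pass  # viewer.apply_pose(pose) in a real integration
```

The timeline-based controls would essentially be a chain of these moves, one per pair of bookmarks.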

There is even more functionality that could be added, but that's for another post. I've restrained myself from giving this one a cheesy pseudo-title, in part because I don't want to presume on someone else, and partly because I just can't think of anything witty right now. Any takers?



Davinci: Open Source Engineering Tools (Part 2)

Continued from Part 1

So, without further pontification, let me introduce you to the project I used to call Starshine, but that I now call Davinci (with no small amount of irony). Davinci is a family of four programs designed to work in close conjunction with each other: Scrawl, Workshop, Notion, and Plaza. Scrawl is a CAD/CAE solution designed to allow contributors to design and modify individual pieces easily and efficiently while remaining faithful to the requirements of the larger project. Workshop is a high-level assembly interface for connecting designs made with Scrawl into larger, more complex projects. Notion is a modular simulation framework which is utilized by Scrawl and Workshop to simulate a wide range of behaviors and conditions. Finally, Plaza is an online versioning system, akin to SVN but custom-built for organizing and storing large Davinci projects. Together they form a tight ecosystem of functionality that allows widespread collaboration. Allow me to describe each system in a bit more detail.

Scrawl's main functionality is fine-level design of individual parts within a larger project. Like most CAD/CAE programs, it would allow for both schematic and perspective views of a design, the ability to define the solid geometry of the design, and the ability to define the physical properties of parts. To facilitate flexibility, Scrawl would allow the import and export of geometry from programs outside of Davinci, a feature also shared by many CAD/CAE programs. Crucially, however, Scrawl would allow for all of this within the context of the larger design: where constraints exist, the interface would show the physical space limits, connection points, and other relevant data in relation to the design. The design could be tested by Notion, which would in turn create a metadata file recording the performance of the piece for use by Workshop. The files Scrawl saves out would be rich in metadata, including physical properties as well as the creator's name and the intellectual property license under which the creator wishes the work to be shared.
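
To give a feel for what "rich in metadata" might mean in practice, here's a sketch of the sort of data a Scrawl part file could carry alongside its geometry. Every field name here is an illustrative guess on my part, not a proposed file format.

```python
# A sketch of the kind of metadata a Scrawl part file might carry
# alongside its geometry. All field names are illustrative guesses,
# not a real format.
import json
from dataclasses import dataclass, field, asdict

@dataclass
class ScrawlPart:
    name: str
    geometry_file: str                 # the solid geometry itself, stored separately
    material: str
    mass_kg: float
    envelope_m: tuple                  # bounding box the larger design allots this part
    connection_points: list = field(default_factory=list)
    creator: str = "unknown"
    license: str = "CC-BY-SA-3.0"
    simulation_results: dict = field(default_factory=dict)  # filled in by Notion

bracket = ScrawlPart(
    name="mounting-bracket-v3",
    geometry_file="bracket_v3.step",
    material="6061-T6 aluminum",
    mass_kg=0.42,
    envelope_m=(0.10, 0.05, 0.02),
    connection_points=[{"id": "bolt-a", "position": (0.01, 0.01, 0.0)}],
    creator="jdoe",
)

# Notion would attach its test results before the file is shared:
bracket.simulation_results["static_load"] = {"max_stress_mpa": 180.0, "passed": True}

with open("bracket_v3.scrawl.json", "w") as f:
    json.dump(asdict(bracket), f, indent=2)
```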

By itself, Scrawl is nothing particularly special. Its true potential comes from its tight integration with Workshop, which allows for the assembly of higher-order designs. Workshop is all about applying an object-oriented approach to the design of multi-part inventions. Users can load Scrawl files and even other Workshop files into a single assembly environment. From the interface, users can combine these sub-components much in the same manner that LEGO bricks can be pieced together. The user can define the level of binding between individual pieces, approximating bolts, welds, or greased joints. The user can inspect the metadata of each individual piece, as well as open any piece in its respective Scrawl or Workshop file. This power to include other Workshop files, I feel, is a must. It is the equivalent of the #include command I mentioned earlier, as it allows for development of sub-components independently of the larger project. However, simply assembling these higher-order designs is of limited utility without the ability to test the system as a whole. This is where Notion comes into play.
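
Here's a rough sketch of that object-oriented assembly idea: an assembly that can contain both parts and other assemblies, with typed bindings between them. The class names and binding types are mine, purely for illustration.

```python
# A sketch of Workshop's object-oriented assembly idea: an assembly
# can contain Scrawl parts or other Workshop assemblies, the moral
# equivalent of #include. Names and binding types are illustrative.
from dataclasses import dataclass, field

@dataclass
class Part:                    # stands in for a loaded Scrawl file
    name: str
    mass_kg: float

@dataclass
class Binding:
    a: str                     # names of the two bound components
    b: str
    kind: str                  # "bolt", "weld", "greased_joint", ...

@dataclass
class Assembly:                # stands in for a Workshop file
    name: str
    components: list = field(default_factory=list)   # Parts or Assemblies
    bindings: list = field(default_factory=list)

    def total_mass(self) -> float:
        """Recurse through sub-assemblies, however deeply nested."""
        total = 0.0
        for c in self.components:
            total += c.total_mass() if isinstance(c, Assembly) else c.mass_kg
        return total

# A sub-assembly developed independently of the larger project...
gearbox = Assembly("gearbox", components=[Part("housing", 2.1), Part("gear-train", 1.4)])
gearbox.bindings.append(Binding("housing", "gear-train", "greased_joint"))

# ...included wholesale into a higher-order design.
drivetrain = Assembly("drivetrain", components=[gearbox, Part("motor", 3.8)])
drivetrain.bindings.append(Binding("gearbox", "motor", "bolt"))
print(drivetrain.total_mass())  # 7.3
```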

Rather than being a monolithic simulation program such as those used by most CAD/CAE systems, Notion is far more akin to a rendering engine framework such as those found in professional 3D animation packages. The key thought behind this is flexibility. Different users will want to test for different things, and different projects will necessitate very different kinds of tests. For example, the designer of an airplane may not care much about simulated crowd flow within a structure, but it is an essential consideration for the designer of a subway station. Conversely, the designer of a subway station probably cares little about hydrodynamic flow, whereas it is absolutely crucial for an airplane's design. This is a simplistic example, but the point is that forcing a single set of simulation tools on everyone not only limits users, it also limits the applications of Davinci. Notion would act as the ambassador between simulation engine modules and Scrawl/Workshop. A user looking to test a design would select the elements to test in either Scrawl or Workshop, and then specify the test and the simulation engine to use. Notion would then glean the needed data/metadata from the selected elements and feed the information to the simulation engine. The output from the engine would be fed back through Notion and displayed within the interface of the originating program. Notion's functionality would not stop there, however. For Workshop files, Notion would allow for multi-level simulation. This would allow tests to be performed not just on static proxies of sub-components, but at to-the-part accuracy. While far more computationally expensive, this would allow for the capture of "gotcha" mistakes, such as an unintended weight shift due to a sub-component's movement or an unexpected loss in performance due to cross-component heat pollution. The aim of such functionality is to come as close as possible to a fully realistic virtual prototype. While perfection of such a prototype is probably out of the reach of any system in the near future, something close could be attainable.
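
To show what that ambassador role might look like, here's a toy sketch of a pluggable engine interface and the dispatch step Notion would perform. The interface, the engine, and its numbers are all invented for illustration.

```python
# A sketch of Notion as an "ambassador" between Scrawl/Workshop and
# pluggable simulation engines. The engine interface is a guess at
# what such a framework might look like.
from abc import ABC, abstractmethod

class SimulationEngine(ABC):
    """Anyone could write one of these: crowd flow, hydrodynamics, FEA..."""
    name: str

    @abstractmethod
    def accepts(self, element: dict) -> bool: ...

    @abstractmethod
    def run(self, elements: list[dict], test: str) -> dict: ...

class ToyStaticLoadEngine(SimulationEngine):
    name = "toy-static-load"

    def accepts(self, element):
        return "mass_kg" in element

    def run(self, elements, test):
        total = sum(e["mass_kg"] for e in elements)
        return {"test": test, "total_load_n": total * 9.81}

ENGINES = {e.name: e for e in [ToyStaticLoadEngine()]}

def run_test(selected_elements, test, engine_name):
    """What Notion would do: glean data, dispatch, hand results back."""
    engine = ENGINES[engine_name]
    usable = [e for e in selected_elements if engine.accepts(e)]
    return engine.run(usable, test)

result = run_test([{"name": "bracket", "mass_kg": 0.42}], "static_load", "toy-static-load")
print(result)  # total_load_n is roughly 4.12 N
```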

With all these components and simulations creating piles of data and metadata, some sort of organizational system would be critical for any serious collaboration. This is where Plaza becomes crucial. Plaza would be a server platform consisting of several different services. The most critical service would be a specially designed SVN-style system that would intuitively and securely archive the data generated by the innovation process. A second service would allow Plaza to act as an abstracted simulation module for Notion, letting Notion leverage large clusters of connected servers for especially complex simulations. To facilitate real-time collaboration on a single file, a service based on the Uni-Verse code-base would also run on Plaza. This would allow multiple contributors to work on a unified design simultaneously, an important feature for when designs get above a certain level of complexity. One final service would be an API allowing third-party applications to securely access the data stored in the repositories. This would allow developers to expand the family of applications that can leverage Davinci. Such applications might include statistical comparison software for weighing the technical merits of different design variations, or a virtual reality walk-through of designs. The possibilities are endless, which is why creating a robust and flexible API would be so crucial.
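
As a toy illustration of the versioning service plus that third-party API, here's a sketch of an in-memory, SVN-style repository with the kind of read call a design-comparison tool might make. A real Plaza would of course be a networked service with authentication; everything here, names included, is hypothetical.

```python
# A toy sketch of Plaza's versioned repository plus the read-only API
# a third-party tool might use. A real Plaza would be a networked
# service; this just illustrates the shape of the idea.
import copy

class PlazaRepository:
    def __init__(self):
        self._revisions = []          # each revision: {"files": {path: data}, "message": str}

    def commit(self, files: dict, message: str) -> int:
        snapshot = copy.deepcopy(self._head())
        snapshot.update(files)
        self._revisions.append({"files": snapshot, "message": message})
        return len(self._revisions) - 1   # revision number, SVN-style

    def checkout(self, revision: int = -1) -> dict:
        return copy.deepcopy(self._revisions[revision]["files"])

    def _head(self) -> dict:
        return self._revisions[-1]["files"] if self._revisions else {}

    # The kind of call a third-party app (say, a design-comparison
    # tool) might make through Plaza's API:
    def diff_metadata(self, path: str, rev_a: int, rev_b: int) -> tuple:
        return (self._revisions[rev_a]["files"].get(path),
                self._revisions[rev_b]["files"].get(path))

repo = PlazaRepository()
r0 = repo.commit({"bracket.scrawl.json": {"mass_kg": 0.42}}, "initial bracket")
r1 = repo.commit({"bracket.scrawl.json": {"mass_kg": 0.38}}, "lighter bracket")
print(repo.diff_metadata("bracket.scrawl.json", r0, r1))
```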

I think I've rambled enough on this system for now. This was meant to be first and foremost a conversation starter, so I look forward to your thoughts. I most certainly lack much of the technical expertise it would take to build such a system, which is why I would very much like to see it developed as open source. There's something poetic about open source software being the key to the creation of open source hardware. In short, if this has sparked an interest in you, feel free to adopt the concept and dive into fleshing it out.



Davinci: Open Source Engineering Tools (Part 1)

Tonight I want to talk with you about something I have struggled for some time to figure out how to express. This idea has been with me for close to, if not over, a year, and yet until recently I was finding it very hard to describe to others. I even posted about it once in my old ideas blog, but even as I wrote that version I felt frustrated by my lack of clarity on the topic. This is my second opportunity to do it justice.

I am, as you may have gathered, an advocate of tools. While it is becoming clearer by the day that humans are not the only animals with the ability to conceive of and utilize tools to achieve our aims, it is one of the defining characteristics which has allowed us to thrive as a species. This is one of the reasons I am such a rabid advocate of open source. It calls on the better angels of human nature to facilitate collaboration in search of better tools, accessible to all. Yet despite all of the wonders open source has provided the world of software, I would hazard that I am one of many who feel that open source must expand beyond the realm of software to deliver its best gifts to humanity. Open source must breach the divide and become a tool for the innovation of corporeal inventions. Several attempts at this have already been made, or are under way. However, their dream will never reach full fruition without confronting a basic reality: advocates of open source hardware lack the equivalent tools that their software compatriots take for granted. Without these tools, open source hardware cannot achieve the same success that open source software has enjoyed.

When a coder sits down at a computer to write a program, they have all the tools necessary for the act of creation and collaboration at their fingertips. Code can be written in a free text editor or software editing application. That code can be compiled, for free, by a compiler residing on the very same computer. The coder can test the fruits of their labor for free and, in most cases, without fear of harming themselves, their computer, or their work. Now, to be fair, all of this functionality can be mimicked by an engineer using a CAD/CAE program. An engineer can design a piece of machinery and test its basic functionality, safely and (depending on the software) fairly cheaply. In this regard, the two systems are relatively similar. However, the differences begin to emerge when collaboration, an essential ingredient of any open source project, comes into the picture.

The power of open source derives from the ability of an individual coder to contribute a relatively small amount of work which can easily be merged into a larger, more complex project. The core of open source is the acknowledgment that not everyone is Superman, and that many people contributing just a little can add up to something greater than the sum of its parts. A coder working on an open source project can easily download all or part of the larger project, make changes, compile the entire project, and test it. Adding the work of others to an existing program is also relatively painless, requiring only a few lines of code to instruct the program how to access the new code and call its functions when needed. Indeed, the command #include and its kin are among the most powerful commands from the perspective of open source. They are powerful because they allow a single coder to quickly add the work of another coder, making collaboration not only easy, but in many cases easier than working completely alone. This is where the design of real things runs into trouble. A team of committed, determined engineers looking to create a large open source design will quickly realize that while they can all design individual pieces and test them individually, there is practically no way to test the entire system without building a physical prototype and testing its performance. While this might be an acceptable solution for something small, simple, and cheap to build, it becomes a serious problem for larger projects. The average contributor is most likely a person of modest means, and probably could not afford to build a functioning prototype of something large, like a car, building, satellite, or playground. There should be a tool that empowers the average contributor to the same level that a simple compiler empowers a coder. Such a tool should allow a contributor to build, edit, test, and share large, complex projects. That is what I shall attempt to describe.
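
To make that "few lines of code" point concrete, here's the idea in Python, with import playing the role of C's #include. The module and function are made up for illustration; the point is how little glue it takes to fold someone else's work into your own.

```python
# gearbox.py — someone else's contribution, dropped into your source tree
def efficiency(rpm: float) -> float:
    """Toy model of gearbox efficiency at a given speed."""
    return 0.92 if rpm < 3000 else 0.88
```

```python
# main.py — your project; one line of glue pulls their work in,
# Python's import playing the role of C's #include
from gearbox import efficiency

print(efficiency(2500.0))  # 0.92 — their code, running in your project
```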

Initially, I envisioned such a tool as a monolithic piece of software. This one program would handle all functionality, from the design of individual elements all the way to the testing of large, complex projects. I based this initial notion on the analogy of a software development environment, where code writing, project organization, and testing functionality are all part of the same program. Understandably, this system became very hard to describe, as I tried to convey a whole family of functionality while retaining the notion of a singular program. It wasn't until recently that I realized what I was really looking for was a close-knit ecosystem of smaller, function-specific programs. Once broken down into functions, the system is suddenly much easier to conceptualize, and hopefully easier to describe.

Imagine my embarrassment...

Continued in Part 2



Wednesday, August 19, 2009

KISS

Any of you who are frequent readers (which isn't many of you, judging by the almighty Analytics) may have noticed the ever-changing look of the blog. For a while I got kind of fussy over its appearance and bit off more than I could chew. The end result was a fugly template with borked code and a frustrated blogger. So, I've returned to a basic little default template. Is it fancy looking? Not really, no. Does it allow people to actually read the blog without seeing template vomit? Yes!

Simplicity, what a concept.
