Sunday, September 13, 2009

Machinima Viewer: Take 2

At this point I'm fairly sure it's no mystery that the concept of utilizing the Metaverse for machinima fascinates me. While it is still in an awkward stage, I feel that machinima will one day be considered yet another legitimate form of storytelling. The Metaverse holds particular promise as a production platform because, unlike other systems, the creation of unique content is widespread, it is stylistically flexible, and it is unconstrained by conceits about what should be done with the platform. This, coupled with the ability for widespread collaboration and less draconian copyright policies, makes the system ideal for the production of machinima. With that in mind, it is worth highlighting that while the potential for greatness exists, the tools to leverage this potential have yet to be built.

In a previous post, I described a very ambitious viewer project that would have allowed machinimatographers (who I'm going to call 'Toggers from now on for sanity's sake) to capture and edit machinima footage in new and powerful ways. While I still hold that system as an ideal, several months of continued introspection have revealed that the construction of such a system would be a herculean effort, with technical challenges that few are willing to tackle at this point in time. A different approach must be taken as a first step. A 'Togger-friendly viewer is possible with existing code, and advanced functionality could be added as development continues. First, an existing viewer could be stripped down to the functionality relevant to 'Toggers and then enhanced with new functions.

Here's a list of the basic functionality a 'Togger would need for setting up a scene and shooting:
Avatar movement
Camera controls
Teleportation
Basic communication (Chat, IM)
Inventory access
Object placement and positioning

Advanced Functionality:
Fine control over camera behavior (maximum speed, camera roll, zoom control)
Local camera view bookmarking
Bookmark-to-bookmark camera animation (see the sketch after this list)
Timeline-based camera animation controls
Aspect ratio window sizing
Shot composition screen overlays
Easier toggling of interface elements
Triggered animation control over actors' avatars (RestrainedLife API?)
Real-time shadows
Built-in footage capture (ditch FRAPS)
Local texture substitution
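
To make the camera items on that wish-list a bit more concrete, here is a minimal sketch (in C++, since that is what viewers of this kind are typically written in) of how local view bookmarking and bookmark-to-bookmark animation might work. Every name here is hypothetical; none of it comes from an existing viewer's code.

```cpp
// Hypothetical sketch of camera bookmarking and bookmark-to-bookmark
// animation; none of these types come from an existing viewer codebase.
#include <map>
#include <string>

struct CameraState {
    float eye[3];    // camera position in region coordinates
    float focus[3];  // point the camera is looking at
    float roll;      // roll angle in radians
    float zoom;      // field-of-view multiplier
};

class CameraBookmarks {
public:
    // Store the current camera state under a user-chosen name.
    void save(const std::string& name, const CameraState& state) {
        mBookmarks[name] = state;
    }

    // Linearly interpolate between two bookmarks; t runs from 0 to 1
    // over the duration of the move. A real viewer would likely use
    // ease-in/ease-out curves and spline paths instead.
    CameraState interpolate(const std::string& from,
                            const std::string& to, float t) const {
        const CameraState& a = mBookmarks.at(from);
        const CameraState& b = mBookmarks.at(to);
        CameraState out;
        for (int i = 0; i < 3; ++i) {
            out.eye[i]   = a.eye[i]   + (b.eye[i]   - a.eye[i])   * t;
            out.focus[i] = a.focus[i] + (b.focus[i] - a.focus[i]) * t;
        }
        out.roll = a.roll + (b.roll - a.roll) * t;
        out.zoom = a.zoom + (b.zoom - a.zoom) * t;
        return out;
    }

private:
    std::map<std::string, CameraState> mBookmarks;
};
```

A real implementation would also honor the maximum-speed and roll settings mentioned above and feed the timeline-based animation controls, but the shape of the idea is the same.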

There is even more functionality that could be added, but that's for another post. I've restrained myself from giving this one a cheesy pseudo-title, partly because I don't want to presume on someone else, and partly because I just can't think of anything witty right now. Any takers?



Davinci: Open Source Engineering Tools (Part 2)

Continued from Part 1

So, without further pontification, let me introduce you to the project I used to call Starshine, but that I now call Davinci (with no small amount of irony). Davinci is a family of four programs designed to work in close conjunction with one another: Scrawl, Workshop, Notion, and Plaza. Scrawl is a CAD/CAE solution designed to let contributors design and modify individual pieces easily and efficiently while remaining faithful to the requirements of the larger project. Workshop is a high-level assembly interface for connecting designs made with Scrawl into larger, more complex projects. Notion is a modular simulation framework, used by Scrawl and Workshop to simulate a wide range of behaviors and conditions. Finally, Plaza is an online versioning system akin to SVN but custom-built for organizing and storing large Davinci projects. Together they form a tight ecosystem of functionality that allows widespread collaboration. Allow me to describe each system in a bit more detail.

Scrawl's main functionality is fine-level design of individual parts within a larger project. Like most CAD/CAE programs, it would allow for both schematic and parametric (perspective) views of a design, the definition of the design's solid geometry, and the specification of the physical properties of parts. To facilitate flexibility, Scrawl would allow the import and export of geometry from programs outside of Davinci, a feature also shared by many CAD/CAE programs. Unlike most of them, however, Scrawl would allow for all of this within the context of the larger design. Where the larger design imposes limitations, the interface would show the physical space constraints, connection points, and other relevant data in relation to the piece being designed. The design could be tested by Notion, which would in turn create a metadata file, including the performance of the piece, for use by Workshop. The files Scrawl saves out would be rich in metadata, including physical properties as well as the creator's name and the intellectual property license under which the creator wishes the work to be shared.
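
To give a sense of what "rich in metadata" might mean in practice, here is a rough sketch of the kind of information a Scrawl part file could carry. The field names are purely illustrative guesses on my part, not a proposed file format.

```cpp
// Rough sketch of the metadata a Scrawl part file might carry; all of
// these field names are illustrative guesses, not a real file format.
#include <string>
#include <vector>

struct PhysicalProperties {
    double massKg;
    double boundingBoxM[3];   // length, width, height in metres
    std::string material;
};

struct ConnectionPoint {
    std::string name;         // e.g. "mounting_flange"
    double position[3];       // location relative to the part's origin
};

struct ScrawlPart {
    std::string name;
    std::string creator;              // who designed the piece
    std::string license;              // e.g. "CC-BY-SA", chosen by the creator
    PhysicalProperties physical;
    std::vector<ConnectionPoint> connections;  // where it joins the larger design
    std::string geometryFile;         // imported/exported solid geometry
    std::string notionReport;         // simulation results written by Notion
};
```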

By itself Scrawl is nothing particularly special. Its true potential comes from its tight integration with Workshop, which allows for the assembly of higher-order designs. Workshop is all about applying an object-oriented approach to the design of multi-part inventions. Users can load Scrawl files and even other Workshop files into a single assembly environment. From the interface, users can combine these sub-components together much in the same manner LEGO bricks can be snapped together. The user can define the level of binding between individual pieces, approximating bolts, welds, or greased joints. The user can inspect the metadata of each individual piece, as well as open it in its respective Scrawl or Workshop file. This power to include other Workshop files, I feel, is a must. It is the equivalent of the #include command I mentioned earlier, as it allows for development of sub-components independent of the larger project. However, simply assembling these higher-order designs is of limited utility without the ability to test the system as a whole. This is where Notion comes into play.
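
The object-oriented, #include-like nesting that Workshop would provide maps naturally onto a composite structure: an assembly holds parts and other assemblies, each attached with a chosen binding. The sketch below is only one hypothetical way to model it, with names of my own invention.

```cpp
// Sketch of Workshop's nesting of parts and sub-assemblies as a
// composite structure; hypothetical types, not an actual Workshop API.
#include <memory>
#include <string>
#include <vector>

enum class Binding { Bolted, Welded, GreasedJoint };

// Anything that can appear in an assembly: a Scrawl part or a
// nested Workshop assembly (the #include-style reuse).
class Component {
public:
    virtual ~Component() = default;
    virtual std::string describe() const = 0;
};

class ScrawlPartRef : public Component {
public:
    explicit ScrawlPartRef(std::string file) : mFile(std::move(file)) {}
    std::string describe() const override { return "part: " + mFile; }
private:
    std::string mFile;  // path to the underlying Scrawl file
};

class WorkshopAssembly : public Component {
public:
    explicit WorkshopAssembly(std::string name) : mName(std::move(name)) {}

    // Attach a sub-component with a chosen level of binding.
    void add(std::shared_ptr<Component> child, Binding binding) {
        mChildren.push_back({std::move(child), binding});
    }

    std::string describe() const override {
        std::string out = "assembly: " + mName + "\n";
        for (const auto& c : mChildren)
            out += "  - " + c.component->describe() + "\n";
        return out;
    }

private:
    struct Child {
        std::shared_ptr<Component> component;
        Binding binding;
    };
    std::string mName;
    std::vector<Child> mChildren;
};
```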

Rather than being a monolithic simulation program such as those used by most CAD/CAE systems, Notion is far more akin to a rendering engine framework such as those found in professional 3D animation packages. The key thought behind this is flexibility. Different users will want to test for different things, and different projects will necessitate very different kinds of tests. For example, the designer of an airplane may not care much about simulated crowd flow within a structure, but it is an essential consideration for the designer of a subway station. Conversely, the designer of a subway station probably cares little about hydrodynamic flow, whereas it is absolutely crucial to an airplane's design. This is a simplistic example, but the point is that forcing a single set of simulation tools on users is not only limiting to them, it also limits the applications of Davinci. Notion would act as the ambassador between simulation engine modules and Scrawl/Workshop. A user looking to test a design would select the elements to test in either Scrawl or Workshop, and then specify the test and simulation engine to use. Notion would then glean the needed data and metadata from the selected elements and feed the information to the simulation engine. The output from the engine would be fed back through Notion and displayed within the interface of the originating program. Notion's functionality would not stop there, however. For Workshop files, Notion would allow for multi-level simulation. This would allow tests to be performed not just on static proxies of sub-components, but at a level of to-the-part accuracy. While far more computationally expensive, it would allow for the capture of “gotcha” mistakes, such as an unintended weight shift due to a sub-component's movement or an unexpected loss in performance due to cross-component heat pollution. The aim of such functionality is to come as close as possible to a fully realistic virtual prototype. While perfection of such a prototype is probably out of the reach of any system in the near future, something close could be attainable.
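
Notion's "ambassador" role reads, to me, like a plugin interface: the design tools hand Notion a bundle of data, Notion hands it to whichever simulation engine the user picked, and the results flow back for display. A hypothetical sketch of that shape:

```cpp
// One way Notion's role as an ambassador between the design tools and
// pluggable simulation engines might look; entirely hypothetical.
#include <map>
#include <memory>
#include <string>

// Opaque bundle of the geometry and metadata Notion gleans from the
// elements selected in Scrawl or Workshop.
struct SimulationInput  { std::string serializedDesignData; };
struct SimulationResult { std::string report; };

// Every simulation module (structural, crowd-flow, hydrodynamic, ...)
// implements this interface.
class SimulationEngine {
public:
    virtual ~SimulationEngine() = default;
    virtual SimulationResult run(const SimulationInput& input) = 0;
};

class Notion {
public:
    void registerEngine(const std::string& name,
                        std::unique_ptr<SimulationEngine> engine) {
        mEngines[name] = std::move(engine);
    }

    // Feed the selected design data to the requested engine and hand
    // the results back to the originating program for display.
    SimulationResult simulate(const std::string& engineName,
                              const SimulationInput& input) {
        return mEngines.at(engineName)->run(input);
    }

private:
    std::map<std::string, std::unique_ptr<SimulationEngine>> mEngines;
};
```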

With all these components and simulations creating piles of data and metadata, some sort of organizational system would be critical for any serious collaboration. This is where Plaza becomes crucial. Plaza would be a server platform consisting of several different services. The most critical service would be a specially designed SVN-like system that would intuitively and securely archive the data generated by the innovation process. A second service would allow Plaza to act as an abstracted simulation module for Notion. This service would allow Notion to leverage large clusters of connected servers for especially complex simulations. To facilitate real-time collaboration on a single file, a service based on the Uni-Verse code base would also run on Plaza. This would allow multiple contributors to work on a unified design simultaneously, an important feature once designs get above a certain level of complexity. One final service would be an API allowing third-party applications to securely access the data stored in the repositories. This would allow developers to expand the family of applications that can leverage Davinci. Such applications might include statistical comparison software for weighing the technical merits of different design variations, or a virtual reality walk-through of designs. The possibilities are endless, which is why creating a robust and flexible API would be so crucial.
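
I can only guess at the shape of that third-party API, but even something as simple as the following interface, speculative names and all, would let outside developers browse, fetch, and commit versioned project data:

```cpp
// Minimal guess at what a third-party-facing Plaza repository API
// could expose; names and scope are speculative.
#include <string>
#include <vector>

struct Revision {
    int number;
    std::string author;
    std::string message;
};

class PlazaRepository {
public:
    virtual ~PlazaRepository() = default;

    // Authenticate before any repository access is allowed.
    virtual bool authenticate(const std::string& user,
                              const std::string& token) = 0;

    // Browse and fetch versioned project data.
    virtual std::vector<Revision> history(const std::string& path) = 0;
    virtual std::string fetch(const std::string& path, int revision) = 0;

    // Submit a changed Scrawl or Workshop file back to the archive.
    virtual int commit(const std::string& path,
                       const std::string& contents,
                       const std::string& message) = 0;
};
```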

I think I've rambled enough about this system for now. This was meant first and foremost to be a conversation starter, so I look forward to your thoughts. I most certainly lack much of the technical expertise it would take to build such a system, which is why I would very much like to see it developed as open source. There's something poetic about open source being the key to the creation of open source hardware. In short, if this has sparked an interest in you, feel free to adopt the concept and dive into fleshing it out.



Davinci: Open Source Engineering Tools (Part 1)

Tonight I want to talk with you about something I have struggled for some time to figure out how to express. This idea has been with me for close to, if not over, a year, and yet until recently I found it very hard to describe to others. I even posted about it once in my old ideas blog, but even as I wrote that version I felt frustrated by my lack of clarity on the topic. This is my second opportunity to do it justice.

I am, as you may have gathered, an advocate of tools. While it is becoming clearer by the day that humans are not the only animals with the ability to conceive of and utilize tools to achieve our aims, it is one of the defining characteristics that has allowed us to thrive as a species. This is one of the reasons I am such a rabid advocate of open source. It calls on the better angels of human nature to facilitate collaboration in the search for better tools, accessible to all. Yet despite all of the wonders open source has provided the world of software, I would hazard that I am one of many who feel that open source must expand beyond the realm of software to deliver its best gifts to humanity. Open source must breach the divide and become a tool for the innovation of corporeal inventions. Several attempts at this have already been made, or are under way. However, their dream will never reach full fruition without confronting a basic reality: advocates of open source hardware lack the equivalent tools that their software compatriots take for granted. Without these tools, open source hardware cannot achieve the same success that open source software has enjoyed.

When a coder sits down at a computer to write a program, they have all the tools necessary for the act of creation and collaboration at their fingertips. Code can be written in a free text editor or software editing application. That code can be compiled, for free, by a compiler residing on the very same computer. The coder can test the fruits of their labor for free and, in most cases, without fear of harming themselves, their computer, or their work. Now, to be fair, all of this functionality can be mimicked by an engineer using a CAD/CAE program. An engineer can design a piece of machinery and test its basic functionality safely and (depending on the software) fairly cheaply. In this regard, the two systems are relatively similar. However, the differences begin to emerge when collaboration, an essential ingredient of any open source project, comes into the picture.

The power of open source derives from the ability of an individual coder to contribute a relatively small amount of work which can easily be merged into a larger, more complex project. The core of open source is the acknowledgment that not everyone is Superman, and that many people contributing just a little can add up to something greater than the sum of its parts. A coder working on an open source project can easily download all or part of the larger project, make changes, compile the entire project, and test it. Adding the work of others to an existing program is also relatively painless, requiring only a few lines of code to instruct the program how to access the new code and to call its functions when needed. Indeed, the #include command and its kin are among the most powerful commands from the perspective of open source. They are powerful because they allow a single coder to quickly add the work of another coder, making collaboration not only easy, but in many cases easier than working completely alone. This is where the design of real things runs into trouble. A team of committed, determined engineers looking to create a large open source design will quickly realize that while they can all design individual pieces and test them individually, there is practically no way to test the entire system without building a physical prototype and measuring its performance. While this might be an acceptable solution for something small, simple, and cheap to build, it becomes a serious problem for larger projects. The average contributor is most likely a person of modest means, and probably could not afford to build a functioning prototype of something large, like a car, building, satellite, or playground. There should be a tool that empowers the average contributor to the same degree that a simple compiler empowers a coder. Such a tool should allow a contributor to build, edit, test, and share large, complex projects. That is what I shall attempt to describe.

Initially, I envisioned such a tool as a monolithic piece of software. This one program would handle all functionality, from the design of individual elements all the way to the testing of large, complex projects. I based this initial notion on the analogy of a software development environment, where code writing, project organization, and testing functionality are all part of the same program. Understandably, this system became very hard to describe, as I tried to convey a family of functionality while retaining the notion of a singular program. It wasn't until recently that I realized what I was really looking for was a close-knit ecosystem of smaller, function-specific programs. Once broken down into functions, the system is suddenly much easier to conceptualize, and hopefully easier to describe.

Imagine my embarrassment...

Continued in Part 2

