Showing posts with label Technical Logs. Show all posts

Tuesday, December 31, 2013

REGARDING THE FUTURE OF C# ...

I'll be out of office for a few days and before I go let me post some words about the features I'd like to see implemented in upcoming versions of C# (btw, on my return I'll be posting again about The APE).

A couple of weeks ago, on the "This Week On Channel 9" series, there was a reference to Mads Torgersen's presentation in London regarding "The Future of C#", announcing new features that could probably get implemented in C# 6.0.

So, in this post, let me explain some of the features that I hope they implement in C# in the short/middle run:

1. LLVM-Like Compiler For Native Code


I talked about this many times, but I think it's a perfect time to mention it again.

So far, if you want to compile MSIL to native code at once, you can use a tool called NGen, which creates a native image of the code for the machine where compilation is being done. The problem with this tool is that its main purpose is to reduce startup times. Meaning? You won't get optimized bits for the whole code; just for the blocks first executed when the program starts.

Imho, we need more ... in this relatively new world of app marketplaces it'd be really handy to count on a model where you can deliver optimized native bits to the device/console/machine where the app would be downloaded and installed, don't you think?

Picture it like this: say you create an app/game with C# for the XBox One (using portable assemblies or not) and compile your source code to MSIL. Since the hardware of the console is the same in terms of main components (processor, memory, etc.), then why not compile the whole MSIL code to native bits optimized for the Xbone console at once? (either on your side or on MSFT's servers)

With an LLVM-like compiler this could be achieved and extended to other platforms. But wait a minute! Isn't that what MSFT is doing for WP8? It sounds like it. But wait a minute, again! Isn't it something like the AOT compilation that can be found in the Mono Framework? If the latter gives optimized bits for whole assemblies per platform, then it is!

In fact, many sources mentioned the so-called "Project N", which would be used to speed up Windows 8.1 apps. What is more, a few sources also mention that MSFT is working on a common compiler for C++/C#. I hope it also brings a more performant way to do interop with C++.

True or not, this is a "must have" in order to end the C++/C# perf discussion!

2. "Single Instruction, Multiple Data" (SIMD)


In modern hardware architectures, SIMD has become a standard when you want to boost performance of specific operations, in particular ("vectorized") mathematical ones.

As a matter of fact, C++ offers the DirectXMath libraries (based on the ones formerly called XnaMath), which do implement SIMD but, unfortunately, are not available from C#.

Again, SIMD is already present in the Mono Framework for math operations, so why not add it to the .NET Framework once and for all?
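To make the idea concrete, here is a scalar sketch of what a SIMD unit does in one shot. The loop below performs four lane-wise additions one at a time; a single SIMD instruction (e.g., SSE's addps, which is what types like Mono.Simd's Vector4f map to) performs all four at once:

```csharp
static class SimdSketch
{
    // Scalar sketch of a 4-lane vector addition: with SIMD, a single
    // hardware instruction performs all four lane additions at once.
    public static float[] AddLanes(float[] a, float[] b)
    {
        var sum = new float[4];
        for (int i = 0; i < 4; i++)
            sum[i] = a[i] + b[i];
        return sum;
    }
}
```

This is exactly the kind of loop that a SIMD-aware math type would collapse into one operation.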

I hope MSFT listens to us ...

3. Extension Properties


We have extension methods, so this is a corollary of them. Today, you can only implement getters (and setters) like this:

   public static string NameToDisplayGetter (this IPerson person)
   {
      ...
   }

Then, why not have something like this?

   public static string NameToDisplay: IPerson person
   {
      get { ... } // You could also add a setter, if needed.
   }

Of course, the syntax in the example above may vary; I guess you get the idea here. There are several use cases where a feature like this could come in handy, including MVVM or plain INotifyPropertyChanged ones.
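For contrast, here is the shape of today's workaround as a minimal, self-contained sketch (IPerson and its members are hypothetical, purely for illustration): the getter is exposed as an extension method, and callers must live with the method-call parentheses.

```csharp
public interface IPerson
{
    string FirstName { get; }
    string LastName { get; }
}

public sealed class Person : IPerson
{
    public string FirstName { get; set; }
    public string LastName { get; set; }
}

// Today's workaround: an extension *method* posing as a read-only property;
// callers must write person.NameToDisplay() instead of person.NameToDisplay.
public static class PersonExtensions
{
    public static string NameToDisplay(this IPerson person)
    {
        return person.LastName + ", " + person.FirstName;
    }
}
```

With extension properties, the trailing parentheses (and the Getter/Setter method-pair convention) would disappear.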

4. Generic Enum Constraints


Generics is one of my favorite .NET features. There are lots of things that can be achieved through it, but there is still room for improvement.

One of the things to improve is constraints. So far, when constraining the kind of type, we have only two options: class and struct. So, what about enums?

Currently, if you want to mimic an enum constraint you will have to write something like ...

   public void OperationX<TEnum>(TEnum myEnum)
      where TEnum : struct, IComparable, IFormattable, IConvertible // ... and so on, so forth
   {
      … usual stuff …
   }

... and also, given that you are dealing with a subset of types that approximate an enum, you need to check whether an enum has been passed, generally throwing an exception if not:

   if (!typeof(TEnum).IsEnum)
   {
      throw new ArgumentException("The passed type is not an enum");
   }
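Putting the two fragments together, a runnable approximation looks like this (Direction is just a stand-in enum for the sketch, and the method returns the enum's name only so it does something observable):

```csharp
using System;

public enum Direction { North, South, East, West }

public static class EnumOps
{
    // Approximates "where TEnum : enum" with struct plus the interfaces
    // every enum implements, then falls back to a run-time check.
    public static string OperationX<TEnum>(TEnum myEnum)
        where TEnum : struct, IComparable, IFormattable, IConvertible
    {
        if (!typeof(TEnum).IsEnum)
            throw new ArgumentException("The passed type is not an enum");

        return myEnum.ToString(); // usual stuff would go here
    }
}
```

Note the weakness: OperationX(123) still compiles (int satisfies all those constraints) and only fails at run time, which is precisely what a real enum constraint would catch at compile time.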

Why not simplify it to something like this?

   public void OperationX<TEnum>(TEnum myEnum)
      where TEnum : enum
   {
      … usual stuff …
   }

Not only does it make sense, but it would also simplify things a lot and open the door to a wide range of handy operations and extension methods.

5. NumericValueType Base Class


.NET's CLR treats structs in a special way, even though they have a base class: ValueType.

I'll not be explaining here the characteristics of built-in primitives and structs; instead, I'll ask the following question: in the current version of C# can we achieve something like this ...?

   TNumeric Calculate<TNumeric>(TNumeric number1, TNumeric number2, TNumeric number3)
     where TNumeric : struct
   {
       return number1 + number2 * number3;
   }

The answer: not without proper casts. So, a life changer for this type of situation would be to add a specialization of ValueType that enjoys the benefits of structs and also supports basic math operations without any kind of wizardry: NumericValueType.
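To illustrate the "wizardry" needed today, here is one workaround that does compile with C# 4 and later: routing the operands through dynamic, which resolves the operators at run time (with the corresponding overhead):

```csharp
public static class NumericOps
{
    // Compiles today, but defers operator resolution to run time via
    // "dynamic"; a built-in numeric constraint would make this statically
    // checked instead.
    public static TNumeric Calculate<TNumeric>(
        TNumeric number1, TNumeric number2, TNumeric number3)
        where TNumeric : struct
    {
        return (TNumeric)((dynamic)number1 + (dynamic)number2 * (dynamic)number3);
    }
}
```

It works for any numeric struct, but every call pays dynamic-dispatch costs and loses compile-time safety, which is exactly the pain a NumericValueType constraint would remove.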

With that class and a new reserved word like, say, "numeric", "number" or "primitive", we could write generic operations and extension methods with a syntax as simple as:

   TNumeric Calculate<TNumeric>(TNumeric number1, TNumeric number2, TNumeric number3)
     where TNumeric : numeric
   {
       return number1 + number2 * number3;
   }

How about declaring new types of numbers? Easy ...

   public numeric Percentage
   {
      … usual stuff …
   }

... or ...

   public numeric Half
   {
      … usual stuff …
   }

No need to specify "struct" since "numeric" would be a value type that supports basic math operations (that we would need to implement when we declare the type, maybe, by overriding some base operations), and so in common scenarios there would be no need to cast values to do math.

6. Declaration of Static Operations On Interfaces


Put simply: having the possibility of declaring static operations when writing interfaces; like this:

   public interface IMyInterface
   {
      static void DoStaticOp();

      static bool IsThisTrue { get; }

      ... instance properties and operations ...
   }

This presents a challenge to both polymorphic rules and abstract declarations, that is, at a "static" level. But as usual, with the proper rationale and care when modifying the language spec, it could be achieved. Many of you may be asking "why bother?", but believe me when I say that I have met situations where static operations on interfaces would have come in really handy.
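Until something like this exists, the usual workaround is a companion static class sitting beside the interface. The sketch below (all names hypothetical) shows both the idea and the limitation:

```csharp
using System;

public interface IMyInterface
{
    void DoInstanceOp();
}

// Today's workaround: statics live in a companion class next to the
// interface, so they cannot be declared abstractly on the interface nor
// resolved polymorphically per implementing type.
public static class MyInterfaceStatics
{
    public static bool IsThisTrue
    {
        get { return true; }
    }

    public static void DoStaticOp()
    {
        Console.WriteLine("Doing the static op ...");
    }
}
```

The companion class works, but the statics are not part of the contract: an implementer of IMyInterface is never forced to provide them, which is the gap static interface members would close.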

Well, this is it for today. What would you guys like to see implemented in C# 6 and beyond? Comments are welcome.

See ya when I get back,
~Pete

Thursday, December 19, 2013

THE ASSET PIPELINE EDITOR - PART 6

After a short break to do some marketing for the APE (sigh!), it's time to resume posting, so it's the turn of a couple of recent additions I implemented in the core functionality:
  • "Direct" build (what?), and
  • Asset Packagers (oh yeah, baby!)
These features have been requested by some of you, so here they are ...

Direct Builds


In part 4 of the series I mentioned that there were two actions when compiling assets, the first one was "build" and the second one "copy/move", and the way to indicate whether to copy or move assets was by (un)checking the following checkbox:


Well, as a corollary of the command-line tool "APEBuild", a third option is now available: direct build. Now you can tell the APE to compile assets directly into the output folder instead of doing it in the "local" folder. In short, it's a way to reduce steps. No need to move or copy.



So, you will have three options from now on: "Copy Assets", "Move Assets" and "Direct Build". Needless to say, the "Default Copy Action" field ("Copy Always", "Copy If Newer" and "None") will only be enabled when "Direct Build" is disabled, and this feature can also be used with APEBuild.

In the picture above, if you look at it carefully, you will get a preview of what I'll talk about next ...

Asset Packagers


Throughout the years, as a moderator in the creators' forums, I read many questions regarding the possibility of zipping your whole content folders. Recently, it has been brought to my attention (thanks again Javier) that some authorware offers the possibility of compressing all output into a single zip file.

Guess what? Now you can also do it with the APE!

How come? Simple ... you create your own packager (where you can use any of the compression techniques available in .NET 4.5, use third-party libraries or add your own compressor) and when you plug it into the APE, the latter will show it as an option on the General Settings tab:


By default, as usual, there is a "pass-through" packager built-in, named "No Packaging", which we can use for the cases where no zipping is required. 

So, when you create/load a solution, its panel will look like this:



We can then select the packager, along with the writer, to set as the default one for the solution (which, of course, can be overridden per project). But that is not it ...


The picture above shows a new field to set in the Project Settings tab: "Pack To". By default, that field will initially equal the path assigned to the "Copy To" field, that is, the output path.  But you can change it, in case you need to have a different target directory for the zipped file.

This opens a new set of possibilities since selecting a packager won't disable copy/move operations, and vice versa. So, if you decide to copy assets and also create one huge compressed file for the whole structure, you can set a different path for the packed file and presto! Do you want just the packed files? No problem: set either the direct build or activate copy/move actions with "None", and you will only get the zipped file.

As I said, you can build, build+copy/move, build+package, build+copy/move+package, direct build, or direct build+package. Pick the right combination for your project's needs.

Please notice that the packaging action is meant to apply to a whole structure of assets and not to each asset individually; in other words, unlike asset writers (which affect each raw file individually as part of the import/export process, even though they are applied per project), asset packagers affect a structure of folders and asset files per project. Say you have a folder named "Content" with all the asset files built in the last execution of the command; this folder is the one the packager will take as a reference to create and output the compressed file (or the compressed files you decide to output).
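As a rough sketch of what such a packager could look like using the .NET 4.5 compression APIs mentioned above (the IAssetPackager contract here is hypothetical, for illustration only, and not the APE's actual plug-in interface):

```csharp
using System.IO.Compression;

// Hypothetical packager contract, for illustration only.
public interface IAssetPackager
{
    void Pack(string builtContentFolder, string packToFile);
}

public sealed class ZipPackager : IAssetPackager
{
    // Compresses the whole built content structure (e.g., a "Content"
    // folder) into a single zip file at the "Pack To" path.
    public void Pack(string builtContentFolder, string packToFile)
    {
        ZipFile.CreateFromDirectory(builtContentFolder, packToFile);
    }
}
```

ZipFile.CreateFromDirectory ships with .NET 4.5 (in System.IO.Compression.FileSystem), which is why no third-party library is strictly needed for basic zipping.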


Last but not least, the "Writers" tab has been renamed to "Output Providers" given the fact that it now includes a configuration panel for packagers, where you can select the packager to use per compilation profile and configure its properties, if any is available.

In the example above, no packaging is set for the Windows target platform and the "Debug" compilation profile. Plus, since this default packager is a pass-through one, there are no public properties you can tweak.

And again, this feature is automatically included in APEBuild!

To sum up ...


With these two additions, the APE now covers several use cases: from usual ones to the weirdest! Thus, I am eager to see what you guys come up with when the first released version of the APE gets launched ...

Btw, I continue working on this handy solution as we speak, so new and exciting features are added on a daily basis.

We are close to starting the campaign at IndieGoGo, so I hope you stick around!!!

'till next post,
~Pete

Friday, December 13, 2013

THE ASSET PIPELINE EDITOR - PART 5

In this part of the series I'll answer some of the questions that some of you've been asking me lately, in particular:

  • Does the APE watch source files?
  • Is there a command-line version of it? And
  • Why not publish it as open-source?
I attempted to give brief answers to a couple of them in this thread at GameDev, and also by email to the guys behind the WaveEngine, but I think they deserve a post here with further details.

So, let's begin ...

Does the APE watch source files?


To answer this question, I need to explain what happens when a new solution is created and saved.

Basically, after saving a solution you will find the following structure on disk:

   + Root Path
      + [Solution's Name] folder
         - [SolutionFileName].fps
         + "Sources" folder
            - sourcefile1
            ...
            - sourcefileN
         + "Projects" folder
            + [Platform1's Name] folder
               - [ProjectFileName1].fpp
               + "Builds" folder
                  + [Profile1] folder
                     - output.fpb
                     + Content folder
            ...
            + [PlatformN's Name] folder
               - [ProjectFileNameN].fpp
               + "Builds" folder
                  + [Profile1] folder
                     - output.fpb
                     + Content folder

The APE creates a folder named "Sources" which is used as a "local" repository for the whole solution. Within it, you will only find files (no folders). Thus, when you add a new file to any of your projects, the APE will copy that source file to the repository and create the corresponding raw file in the structure of your solution.

Following this rationale, there is no need to dynamically watch file changes. Why? Simple: if you manually change one of the source files directly, then the next time you build content, that source file will be used to build assets, provided it complies with the condition indicated for building: Always|New|None.

In other words, the APE watches changes over an existing source file only at the moment that new builds are requested. If a project is marked as "Build Always" then, no matter what, all included raw files will generate a new asset file. If a project is marked as "Build If New" then only raw files with new source files assigned will end up having a new asset file. Finally, "None" will exclude the project from the build process.
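As an illustrative sketch (not the APE's actual code), a "Build If New"-style decision can be as simple as a timestamp comparison evaluated at build time, which is why no live file watcher is needed:

```csharp
using System.IO;

public static class BuildCheck
{
    // Returns true when the asset is missing or older than its source;
    // evaluated only when a build is requested, never via a file watcher.
    public static bool NeedsRebuild(string sourceFile, string assetFile)
    {
        if (!File.Exists(assetFile))
            return true;

        return File.GetLastWriteTimeUtc(sourceFile) >
               File.GetLastWriteTimeUtc(assetFile);
    }
}
```

Checking lazily like this trades instant feedback for simplicity: nothing runs in the background, and the decision is always made against the files as they exist at build time.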

Now, there was a second part to the question posted on GameDev's forum which had to do more with the processing side of things than with what I've explained above.

The APE will NOT replace production tools like Photoshop, Sound Forge, and so on so forth. So you will need to create your source files there: jpegs, wavs, mov, etc.

What the APE provides is a way to indicate how to process source files to get the file format you need for your games. In case the built-in import/write units, or the ones provided later on by me and/or any other user, are not useful to you, then you can implement your own with full control over them.

So, if you guys want to implement a processor that converts WAVs into OGGs, you can go ahead and do it with ease. What about resizing a texture? Sure. What else? Anything you can imagine that can be achieved by setting parameters on a property grid.

For example, for the case of XNA'ers, in part 4 of the series I showed a processor with many features that pre-multiplies alpha, resizes textures, changes formats and so on.

So, to sum up this part of the question: to create source files you will need to use production tools. But to import them into your games as asset files with a given format, you can use the APE.

Is there a command-line version of it?


Yes, there is! And its name is "APEBuild" (thanks, Javier, for suggesting the name!).

When I designed the APE I took into consideration server-side-like use cases. As a matter of fact, it came about as a corollary when I developed the base test assembly for import/write units (please refer to part 1 to see an image of it).

In the current state of this command-line tool, only two actions can be executed: either you build an entire solution or only a set of projects. Let's have a look at the structure ...


The picture above shows what you get when you execute the tool with no parameters (and also with wrong parameters).

So, if you want to build a solution, just execute the tool with one parameter: the path to the solution file. And if you want to build some of the projects in the solution, then add "p:" as an argument, followed by the names of the target platforms, separated by a comma. See the example in that picture.

Now, there are a few restrictions: first, the solution filename must always end with ".fps"; second, the tool will handle trimmed versions of the platforms' names; third, all passed platforms must exist in the solution or the tool won't execute; and fourth, when you pass a relative path to the solution, the path to the folder where APEBuild is located will be considered the root folder in order to build the absolute path.
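The rules above can be sketched as follows (illustrative only: APEBuild's real parsing code is not public, and the names here are made up):

```csharp
using System;
using System.Linq;

public static class ApeBuildArgs
{
    // Restriction 1: a solution path must end with ".fps".
    public static bool IsSolutionPath(string path)
    {
        return path.EndsWith(".fps", StringComparison.OrdinalIgnoreCase);
    }

    // Restriction 2: "p:" is followed by comma-separated platform names,
    // which are trimmed before being matched against the solution.
    public static string[] ParsePlatforms(string arg)
    {
        if (!arg.StartsWith("p:", StringComparison.OrdinalIgnoreCase))
            throw new ArgumentException("Expected a platform list like p:Windows,iPhone");

        return arg.Substring(2)
                  .Split(',')
                  .Select(name => name.Trim())
                  .Where(name => name.Length > 0)
                  .ToArray();
    }
}
```

Validating that every parsed platform actually exists in the solution (restriction 3) and resolving relative paths against the tool's folder (restriction 4) would follow the same spirit.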

So, when we execute the command for the entire solution we have been using as an example on the series, this is the result for a successful build:


And when you execute the command for a couple of projects, the result would be the following:


You will also get messages in case of warnings and exceptions:


The above picture shows a warning indicating that an import unit that should have been plugged into the tool as an add-on is missing. However, since it's not used during the build process, the latter runs normally.

So, it is important to remember that:

1. Before executing the command you will need to check that all the import/write units are present in the corresponding folders associated to APEBuild (as you would also do with the APE's editor), and

2. When you commit source files, you will also need to commit the updated versions of the APE's solution/project files to the server, or otherwise you won't get the results you were most likely looking for.

There are some features I'd like to add in future versions like, say, verbosity control (that is, the level of detail you get as output), but the tool gets the job done in its current state, which is really handy!

Why not publish it as open-source?


I'd have preferred to address this question closer to the campaign's launch date, but since a few of you have asked it recently, I decided to answer it now.

But before moving forward, I'd like to state that I will neither argue nor open a discussion regarding whether open source is good or bad business-wise, since that depends on factors whose relevance may vary per person (yes, "YMMV") and therefore not only does it lie beyond the scope of this post, but I also don't feel like pursuing a Pyrrhic victory.

Instead, I'll be posting a few words explaining my decision to publish it as a commercial tool in the near future -that is, provided the campaign at IndieGoGo succeeds.

Honestly, I haven't decided the price for a license yet, but I intend to license the APE per seat, per platform, per major version. Yes, if you buy one license at launch, you will be able to use it for the whole v1.x! No annual subscriptions, no different versions for indies/pros, no complications for anyone.

Now, although the price is not yet decided, believe me when I say that it will be low, and even lower for those of you who decide to contribute to the campaign at IndieGoGo. I'm an indie, so I know what it feels like not being able to afford licenses from time to time. So during the campaign it'd be like going out with some folks to a movie theatre, say, on a Friday night.

So, why this decision?

First, open source is difficult to keep alive over time. You need to coordinate efforts, review contributors' code, handle branches, and maybe even, at some point and to some extent, include contributors in the decision-making process.

But that is not all: most of the people on the team would likely have daytime jobs, so development of updates to the tool would be done at night, provided there is some spare time left. Going to college? Does your job demand most of your productive time? Have a wife/husband? How about some kids? Then, you know the drill ...

It's not a surprise that many open source APIs and tools eventually follow the commercial route, or that their owners publish a letter indicating why they cannot continue working on them or why updates will slow down. It's completely understandable! There's a lot of time, effort and even money put into them, and even though donations may be received, they eventually end up being not enough to even cover production costs. Not to mention costs of living.

So, instead of trying the open source route first and then following the commercial one, I prefer to skip that part and commercialize licenses of the APE from square one. Succeeding in this task will assure the continuation of the tool, since I will dedicate not my spare time but my production time to make it happen. And if I fail, I'll continue to use it as is for in-house projects. No hard feelings.

I have one more thing to add in this respect, as an example: the guys behind the Mono Framework started the project as a non-profit endeavour. But then they realized that, in order to continue offering the products they loved to develop, a change in course was imminent. For many, this could have been a change in principles, but for me it was a wise decision. Today, they're successfully running Xamarin, they're growing strong and their products are a must-have for every serious dev that wants to port .NET-based apps/games to many platforms. And even MSFT recognizes it!

Btw, regarding XNA's Content Pipeline: it was freely available as long as you didn't want to develop games for the XBox 360 (and then, the Windows Phone). Otherwise, you had to pay an annual subscription for the Creators' Club and a registration fee for the Windows Phone (both now unified).

So, to wrap up this third question: before using the word "disappointing" -given the fact that it won't be open source- please give the APE an opportunity to show off its key features and wait for the campaign at IndieGoGo. You won't be disappointed!

Ok, this is it for this part of the series. Hope you all come back for the upcoming part 6 next week.

See ya next time,
~Pete

PS: btw, I recommend you have a look at Xamarin's subscription plans if you haven't yet!

Wednesday, December 11, 2013

THE ASSET PIPELINE EDITOR - PART 4

On part 3 of the series, I talked about some of the features related to solutions and projects. So now it's time for me to refer to another key feature of the editor: building assets.

The picture below shows the last state of the solution we were using as an example for the series:


Let me remind you that the raw files named "bkgClouds" were bound to different source files: one sized 640x400 texels and another sized 320x200 texels.

Since the APE does not come bundled with any import/write units other than the pass-through ones, by clicking on one of the raw files we'll get the following view on the raw file tab:


Thus, as you can see, the pass-through importer is assigned by default to all the raw files in the solution. This default importer has no properties we can modify since all it does is copy the source file associated with each raw file to the destination folder without any kind of processing/formatting.

However, there is one property that I'd like to highlight here that I didn't mention in my previous posts, and that is the checkbox named "Build Asset ?". If you have been following the series then you'll likely remember that for solutions and projects you could indicate the actions to execute for building and copying/moving, right? Well, this checkbox allows us to override those settings for a specific raw file, so if we uncheck it, no build/copy actions will be applied to that raw file.

In fact, if you look carefully at the first screenshot above, corresponding to the solution tree, you will notice a checkbox beside some of the nodes. By checking/unchecking that control, not only can we include/exclude a raw file from build/copy actions, but also the nodes of an entire folder or container. And yes, when you save solutions and projects, these selections are also persisted. Nice!

Now, how can we build an entire solution or a specific project? Glad you asked. Please take a look at the picture below and follow the numbers:


First, for each project at a time, check the Project tab and select the corresponding values for each field. In particular for this example, we need to set the build/copy actions and indicate the correct path to copy/move the asset files. For now, we disregard the Group Id field and the writers grid since we have only one writer we can use at this moment and thus we have no use for the former.

Second, we indicate which nodes among the containers, folders and raw files must be included in or excluded from the build process by checking/unchecking the respective checkbox.

Third, we select the correct writer for the project ("Default Writer") for the compilation profile to use ("Debug", as shown in the picture above).

Fourth, we select whether to build the entire solution (that is to say, all projects) or one project (the one shown in the Project tab). Also, we indicate the target compilation profile (the one that corresponds to your game's executing assembly, as compiled in Visual Studio, Xamarin Studio, Eclipse, Netbeans or other).

Having done that, we also need to indicate for the solution whether we want our asset files copied or moved to their final location ...


Then, we are ready to go. So all we need to do is compile the solution or project by clicking the "Build" button (the arrow in green).


You can also build the solution or project using menu options, "F5" for the entire solution or "Control + F5" for the selected project.

Now, what result shall we obtain for our example? To see that, we pick one of the projects in the solution explorer and open its folder with the "Open In Explorer" menu option, say, the iPhone.


You will get, as an asset file, a copy of the image sized 320x200 texels. And believe me when I say that if we built the entire solution (or only the Windows project) we would also get the image sized 640x400 texels as an asset file, located in the path corresponding to the Windows project.

But this isn't all we get as the picture below shows:


The APE maintains a repository in the corresponding path for our solution, but it also moves/copies the obtained files, with the correct path structure, to the location you specify as the output folder, adjusted for the corresponding compilation profile. Since we said we wanted all our assets copied, the "local" repository keeps a copy of all asset files. Otherwise, no local copies would exist.

Again, when we compile the solution we also obtain the assets for non-selected projects, provided they are also marked for build + copy/move actions.

Now, when your game expects a file with a specific format (other than public ones like png, jpg, wav, wmv, and so on) and/or a specific file extension, the above example is not useful for you. Therefore, for the following example I'll use a Texture2D importer and an XNA-like writer for a Monogame project. But please remember that you can use the APE for any game engine not related to XNA/Monogame, provided that the corresponding import/write units are plugged into the editor.

So, allow me to introduce you to a nice feature that the APE has: clean solution/project.



Both operations are similar, the only difference being that cleaning the solution will clear the content of all projects. These operations must be used with caution since they will delete the entire content of the folders indicated in the corresponding lines, and not only raw files and nested folders. So, before pressing start, we should check that the paths are right. You can always cancel the operation by clicking the cancel button or by closing the window, in case you need to. And you can indicate whether to clean one folder (the local folder or the output folder), both folders or none. This is indeed a handy and powerful feature, don't you think?

Ok, after cleaning the solution and adding the corresponding import/write units to the editor, we are now able to reload our solution, and once we do, we should get something like this:


In order to get it, first we need to select the only raw file we have in the Windows project. Then, we need to select the category "Textures" (1) and finally select the importer named "Xna Texture Importer" (2), which in this case is selected by default. As you can see, we get some information about the importer and, in this case, a couple of read-only properties ("Group Id" and "Is Source File Also Copied ?").

As a result of the above, we can now see in practice how the Group Id field on the Project tab can be used (3): a value of zero indicates that we want to target XNA/Monogame's HiDef profile, and a value above it, the Reach profile.

If we expand the panel with the properties for the 2D Texture Processor we'll get:


I won't explain each of the fields you see there. Some of them are similar to what we had in XNA's CP GUI, but there are some additional ones, which I believe speak for themselves. All I can say is that this is not a mock-up; in fact, it's a real processor in alpha stage currently consuming WPF's bindings for WIC. During the campaign at IndieGoGo I'll give more details about this.

After tweaking some of the parameters, let's have a look at the formatter:


Nothing to do here. The "Xna Binary Formatter" will give the proper format to the asset data but with one difference for those of us accustomed to XNA's CP (given the way I've implemented these import/write units): the XNA's header for the asset file is not included as part of the format. That will be added by the writer later on. Needless to say that you can design your import/write units so that the formatter includes the header instead of the writer, which is fantastic given that it shows how flexible the APE really is!!!

Ok then, and please again follow the numbers, before proceeding to compile the texture asset for the Windows project we'll need to get something like ...


First, we select the Windows project.

Second, we need to set the proper writer for the compilation profile we are going to use (in this case, for "Debug").

Third, we need to select the "Normal" compression mode on the Writers panel instead of the "Automatic" one that is shown in the picture above. Why? Because I haven't yet implemented the so-called LZX compression -which is a real pain, believe me- or any other compression for the writer.

I'd like to add a side note here: do you notice the data displayed on the writer panel for the Xna Writer? (default, version, etc.) In particular, check the following three: "Writes checksum", "Requires password", "Encodes Data". If marked as "Yes", these fields indicate that the writer may or may not do the associated task (depending on whether it is activated manually/automatically, as implemented by us), like calculating and writing an MD5/SHA checksum somewhere in the asset file, adding password protection and/or encoding/encrypting the asset file. And if marked as "No", the writer does not support that action.

Fourth, and this is really important: we need to save the solution!!! (this is required to apply changes)

Fifth, we need to indicate that we're going to build only a project (the selected one) for the "Debug" compilation profile.

Finally, build the assets by pressing the corresponding menu option, button or keyboard shortcut ("Control + F5"), the result in the local repository being the following:


This is a well-formatted xnb file with pre-multiplied alpha, a mip-mapped chain and its top-most texture resized to the nearest power-of-two value for 640x400 texels (I leave the answer to the latter as homework for you). Since this asset file is not LZX-compressed, its size on disk is rather high. Again, a copy of this file is located in the output folder that the XNA/Monogame solution uses.
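As a side note on the writer panel fields mentioned earlier ("Writes checksum", "Requires password", "Encodes Data"), the capability flags could be sketched like this. This is a hypothetical simplification: the class name, the flag placement and the decision to append an MD5 digest at the end of the file are my assumptions, not the APE's actual API.

```csharp
using System;
using System.IO;
using System.Security.Cryptography;

// Hypothetical sketch of a writer with capability flags. A flag reported
// as "No" means the writer cannot perform that task at all; "Yes" means
// it may, depending on manual/automatic activation.
class SketchAssetWriter
{
    public bool WritesChecksum { get; set; }   // may embed an MD5/SHA checksum
    public bool RequiresPassword { get; set; } // may password-protect the file
    public bool EncodesData { get; set; }      // may encode/encrypt the payload

    public byte[] Write(byte[] formattedAsset)
    {
        using (var ms = new MemoryStream())
        {
            ms.Write(formattedAsset, 0, formattedAsset.Length);
            if (WritesChecksum) // activated: append a 16-byte MD5 digest
            {
                using (var md5 = MD5.Create())
                {
                    byte[] hash = md5.ComputeHash(formattedAsset);
                    ms.Write(hash, 0, hash.Length);
                }
            }
            return ms.ToArray();
        }
    }
}
```

A writer with all three flags off degenerates into a plain byte-for-byte dump, which is exactly the behavior the "No" markings advertise.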

Finally, if we run the game's solution with a line like the following inside the LoadContent method ...


   this.texture = this.Content.Load<Texture2D>("textures/backgrounds/bkgClouds");
      

... and the corresponding call in the Draw method, as shown in the trailer for the APE, we'll get something like this on a WindowsGL-targeted game using Monogame (and please don't ask me "why do you need mipmaps for this example?"):


Enough said!

In the following parts we'll be getting into more technical stuff.

Cheers!
~Pete

Tuesday, December 10, 2013

THE ASSET PIPELINE EDITOR - PART 3

In part 2 of the series, I talked about some basic features of the APE, so it's time to show how to change the structure of a project and populate it with some raw files; that is, before starting to build assets.

I could begin this post by telling you how to create a new solution from scratch, but instead, I'll show you a nice small feature the editor has that allows you to open recent solutions, which spares you from having to use the Open Solution menu option ("Control + O") and browse folders until you locate the solution file.

Recent Solution List

Do you notice the small red (almost transparent) cross beside the only entry in the list of recent solutions? If you press it, you will be prompted to confirm the entry's removal from the list. The cool thing is that the APE does not need to restart in order to refresh the list. Ditto for the "Clear All ..." option.

Once the solution is loaded, we can start modifying the tree structure to meet the needs of our game. This is how the solution looks so far:

The Solution So Far

Let's add a couple of nested folders to the Windows_Own container by selecting the "Add Folder" menu option (or pressing "Control + Shift + F") twice. Now our solution looks like this:


By default, folders are created with the name "folder_<#>", so let's rename them with meaningful words (by double-clicking the node):


Better, right? But wait a minute! What if we want to have a similar structure in the default container? All you have to do is open the context menu for the node you want to copy (right-click on "textures") and you will be presented with this:

You have two options there: either you move or copy the node and all of its content to another non-nested location; in this case, to the Windows_Default container. Note: if you try to move or copy the "images" folder into the "backgrounds" folder the operation will fail.

Since we want to replicate the structure, we select the copy option. Then, by opening the context menu for the default container, we get the following:


By pressing paste, the new structure for our solution will look like this:


It's important to mention here that we could have obtained the same result by using drag'n'drop. Yes! Dragging the textures folder and dropping it onto the default container would have been allowed. In fact, when you drag'n'drop nodes in the solution explorer, the APE asks us what we want to do with them: move, copy or cancel.

Also, it's worth mentioning that, in this case, we could have copied the whole content of the self container ("Windows_Own") by using the context menu for the container. That is also allowed. The difference is that, in this case, the container itself won't be moved or copied, just its child nodes.

Now, let's add some raw files. Shall we?

There are three ways to add raw files to containers or folders: (a) you select the container/folder and then use the "Add Raw File" menu option/button (or press "Control + Shift + R"); (b) you select the container/folder, open its context menu and click on the "Add Raw File" option; or (c) you use drag'n'drop (you can import raw files into your solution by dragging them from the explorer and dropping them onto the container/folder).

In the picture below, you can see how to use the context menu for the target folder.


One important feature to mention here is that you can import more than one file at a time (batching) with any of the three ways the editor offers, including drag'n'drop.

For the example, we just added a nice cloud background to the backgrounds folder:


The picture above shows a lot of new information about the editor:

1. The name given to the added raw file is its filename (without the extension, if any). This holds provided that the name does not already exist in the same "relative" location of the solution's structure; otherwise, the file will be renamed to "file_<#>".

2. On the container tab, the editor will show you the proper previewer for the file (remember that if none is found it will just show the system icon for it).

3. A new tab gets enabled: "Raw File". Here is where you tweak import settings for the raw file. The APE will try to assign an importer to it based on the file extension, giving priority to the importers marked as default (more on this in later posts).

4. The source file is copied into the "Sources" folder for the solution. This folder works as a repository for the entire solution, so you cannot have two source files with the same name co-existing in the folder, since one of them will be replaced. This has nothing to do with the names you give to the nodes in the solution explorer (you can have two nodes with the same name in different "relative" locations of the solution tree).
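The naming rule from point 1 could be sketched as follows. This is a guess at the behavior described above; in particular, the exact numbering scheme of the "file_<#>" fallback is an assumption.

```csharp
using System.Collections.Generic;
using System.IO;

// Sketch of the raw-file naming rule: use the filename without its
// extension, unless that name already exists at the same relative
// location of the solution tree, in which case fall back to "file_<#>".
static class RawFileNaming
{
    public static string NameFor(string sourceFile, ISet<string> namesAtLocation)
    {
        string candidate = Path.GetFileNameWithoutExtension(sourceFile);
        if (!namesAtLocation.Contains(candidate))
            return candidate;

        // Name collision: pick the first unused "file_<#>" slot.
        int n = 1;
        while (namesAtLocation.Contains("file_" + n))
            n++;
        return "file_" + n;
    }
}
```

So adding "bkgClouds.png" to an empty folder yields a node named "bkgClouds", while adding it again to the same location falls back to the numbered name.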

In order to change the name of the raw file, we could do it the same way as before, by double-clicking on the node; but in this case we will use the "Name" field on the raw file tab and rename the node to "bkgClouds".

Now, before showing you how the structure of the solution currently looks, let's add a new project for, say, the iPhone platform. And here it is ...


As you can see, the default containers on both projects have exactly the same structure; however, the self containers are different. In effect, that is the purpose of self containers.

Although the "bkgClouds" raw file is present in both default containers, it is treated as a deep copy, so mirror operations are bound only to add/remove/copy/move. Properties, like importer, processor and formatter-like settings, are independent for each node, so you can have one set of settings for the raw file on the iPhone and a different set for its sibling on Windows. Really great!

Now, what if you want to import two different source files but keep them under the same raw file on different projects? Let me translate that into an example ... say that you want to replace the source image for bkgClouds, but only on the iPhone project, with a lower-res version. Well ... yes, you can! How? Simple. Look at the picture below:


On the Raw File tab there is a field named "File" with a button labeled "..." (marked all in red), which shows the location of the source file in the solution. When you click on the button, you can browse your folders until you find the file you desire.


In this case, we'll select the 320x200 texels version of the cloud image for the iPhone platform to get this window:


The APE will prompt us to confirm the operation and will also let us delete the existing source file if needed. Since this is not the case, given that we need both of them (one for Windows and the new one for iPhone), we leave this checkbox unmarked.


Once the operation completes, the image previewer for the "bkgClouds" raw file on the iPhone project will show the new information. Now we have two different source files for the same logical name. Therefore, after building assets, when you load the "bkgClouds" texture in your game, you will get the 640x400 version on Windows and the 320x200 one on the iPhone. No need to append "_640x400" or "_320x200" to the load operation.
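To picture how one logical name can resolve to a different source per project, here's a toy sketch. The dictionary shape, the asset path and the source file names are purely illustrative assumptions; the APE's internals surely differ.

```csharp
using System.Collections.Generic;

// Toy sketch: the same logical raw-file name maps to a different source
// file per project, so game code only ever loads by logical name.
static class SourceResolver
{
    static readonly Dictionary<string, Dictionary<string, string>> sources =
        new Dictionary<string, Dictionary<string, string>>
        {
            ["Windows"] = new Dictionary<string, string>
            {
                ["textures/backgrounds/bkgClouds"] = "Sources/bkgClouds_640x400.png"
            },
            ["iPhone"] = new Dictionary<string, string>
            {
                ["textures/backgrounds/bkgClouds"] = "Sources/bkgClouds_320x200.png"
            },
        };

    public static string Resolve(string project, string assetName)
        => sources[project][assetName];
}
```

The game-side load call stays identical on every platform; only the build-time resolution changes.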

Let's have a look at our solution's Sources folder:

As you can see, the repository has both source files for the same raw file. And I guess that you now fully understand the difference between a "source" and a "raw" file (if not, the former is the actual file and the latter its representation in the solution's tree).

Ok, before finishing this post, let me show you one more relevant feature the APE offers. For this task, say that you accidentally messed with the Sources folder and deleted one of the images; for instance, the one ending in _640x400. What would happen? Well, when you re-open the editor, you will see the following:


The APE indicates that the solution is corrupted and also marks the path to the offending raw file. To fix it, you can either manually copy+paste the source file into the Sources folder, or use the Raw File tab -as we did before- by clicking the "..." button beside the "File" field and browsing to the location of the source file.

In order to validate the solution you have two options: either save, close and re-open the solution (in which case the APE will warn you that you are attempting to save a corrupt solution), or use the validate menu option ("Control + Shift + V") ...


Once the validation process has finished, the editor will look like this:


So, we can now safely continue to work normally with our solution and save it whenever we want.

Btw, you can browse to the folder where your solution is located by using the "Open In Explorer" menu item (ditto for projects) ...


And you'll get to ...


Do you notice the icon on "MySolution"? When the editor is closed, you can launch it and load the solution just by clicking that file. Indeed. The APE will open and load the solution for you.

Ok guys, this is all for this post. On my next post we'll talk about building assets.

See ya!
~Pete

Friday, December 06, 2013

THE ASSET PIPELINE EDITOR - PART 1

Now that the cat is out of the bag, I believe it's time to start posting some details about the Asset Pipeline Editor (from now on, "the APE").

So, what is this tool? To answer it, let's travel through time to a point, say, four years ago or so.

To make a long story short, during the golden age of XNA I was in search of an efficient way to replace the built-in content manager class with my own. So I created my own version of this class, but I was still using XNA's content pipeline.

Then, when MSFT announced that they would stop any development on the XNA Framework, like many of you, I remained captive to the Visual Studio 2010 IDE for building xnb's. This year, I decided not to wait any longer for miracles and started building my own content-pipeline replacement.

Let's face it! How many of you have lately been crying out loud for XNA 5? Or asking where the content pipeline has gone? You still need VS2010 to use the content pipeline if you want to use XNA, Monogame or ANX. And no definitive solution has been provided yet. And even if there is something in the works, is it worthwhile for teams?

Be honest here. You know that in game development a content pipeline is indeed needed, but why stick to a particular programmers' IDE (like VS), or any other programming IDE, in the first place? It could make sense for programmers only, but for teams, artists, or even solo devs who want an independent process, it does not. In fact, it can turn out to be cumbersome.

So, that is why I developed the APE: to replace XNA's content pipeline "in spirit" (since it has key differences) for those inhouse projects where I wasn't using Unity, UDK or any other authorware with its own content-management features.

I must admit that at first I wasn't expecting to get these results, since I was aiming lower, but as time went by, I realized that -and please allow me to say it- I was creating something really good. So I went on until I said "Wow!".

Now, about its key features:

1. It's not only for XNA: it works for any kind of custom content, not just xnb's. In fact, if you have your own way to manage content when programming a game or providing solutions (like WaveEngine, for instance), you can use it safely, since the tool's job ends when the asset is built and copied to the folder you indicate.

2. It's a highly customizable tool: not only can you tweak the editor to meet the needs of your project (to some extent, of course), but you can also define your own building process: import, process, format, write. You dream it. You got it.

3. It eases the task of managing game content: that is, throughout the whole development process. If, say, you just want to use the input files as-is, like pngs, jpegs, and so forth, without any processing, you can, because the editor comes with pass-through units. So you can still use the tool to manage which content goes where, even if no processing is required.

4. It's independent of any programming IDE: this is heaven for artists! If you are a solo programmer working on a game, having to use VS to build content is fine, but when you're working on a team with artists this is not good at all. The APE comes with a GUI of its own.

5. Test units before you promote them: "why is my custom processor not working?". Say bye-bye to this kind of question, since you can test your custom "units" (as I call them) before using them with the editor. Indeed. You can build your own testing assembly for your custom content/process. And once you give it a green light, you plug it into the editor as an add-on.
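The four-stage build flow from point 2 (and the pass-through units from point 3) can be sketched as a simple chain of functions. The delegate shapes below are illustrative only; the APE's real unit interfaces are certainly richer than this.

```csharp
using System;

// Hypothetical sketch of the build flow: import -> process -> format -> write.
static class AssetBuild
{
    public static byte[] Build(
        byte[] source,
        Func<byte[], object> import,   // raw bytes -> in-memory representation
        Func<object, object> process,  // transform (resize, premultiply, ...)
        Func<object, byte[]> format,   // lay out the binary asset format
        Func<byte[], byte[]> write)    // final write step (compress, checksum, ...)
        => write(format(process(import(source))));

    // A pass-through build: the input file is shipped untouched, which is
    // how you'd manage content that needs no processing at all.
    public static byte[] PassThrough(byte[] source)
        => Build(source,
                 b => b,
                 o => o,
                 o => (byte[])o,
                 b => b);
}
```

Because each stage is an independent unit, any single stage can be swapped out (say, a different formatter per target platform) without touching the rest of the chain.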

Here you can see the main editor as of today:


The areas indicated in the picture above are the following:

1. Solution Tree: create a solution, add a project for a platform, and you will be able to traverse all the nodes here. Projects can have two or more containers: self, default and partial. And you can add as many folders as you want to each container. In future parts I'll post more details about this.

2. Search Tool: if you need to see specific nodes of the tree, this is a quick way to do it. It works recursively over the last search, so you can keep narrowing results until you find what you want.

3. Output Settings: for every "raw file" that you want to import, you can define how to process it and format it. And for each project, you can define how to write ("export") assets to disk. These are powerful tabs. More about them later ...

4. Build Tool: here is where you decide whether to build assets for the whole solution (all projects) or for a specific project; and also where you define the compilation profile: debug, release, test, ... you name it ... add the ones you need for the solution. You will be able to watch it on a how-to video, later.

5. Configuration Tabs: so far there are four: general settings, solution settings, project settings and container settings. Basically, you can tweak many properties there and even add/remove entries for platforms, profiles, default importers, default writers, to mention just a few. In a later post I will cover all settings.

6. Log Panel: a classic in any respectable IDE; here is where the editor informs you about the state of a given process: whether it succeeds or fails, warnings, exceptions. The usual stuff. It's a real friend for understanding what's going on. There is more to it than meets the eye, and that deserves another post.

7. Bars and shortcuts: well, this is not indicated in the picture above, but of course you have menu items (both as text and buttons) and keyboard shortcuts for many features. Not to mention context menus, where applicable, for example to copy or move content. Ahhh, yes! I almost forgot: drag and drop is allowed.

I don't want to go beyond the scope of this post, but I cannot help adding the following picture:


Above there's an example of a test assembly. When you create your units for a specific type of asset -by the way, using C#- this is where you debug them and see whether everything works as expected. Add breakpoints, step through code, find offending code!

And finally, for this post: "previewers". When you traverse the solution tree and select a raw file (or source file), the editor shows information about the file as well as a previewer for it. By default, the APE comes with previewers for some image, audio and video formats. Or else it will switch to an icon previewer. But the good news is that if you need to preview more formats you can create your own previewers! What is more, you can replace all built-in ones if you want ....


The picture above is an example of a previewer for an audio file.

Now, before ending this post, there are two remaining points I would like to address:

1. Current state of the APE: ready for my inhouse projects. The units, besides the pass-through ones, are meant for my inhouse projects only. Plus, there are some features I would like to add (and even some porting to do) before a public release, if you guys are interested in having this tool, so ...

2. What's next: my idea is to start a fund-raising campaign on IndieGoGo to make a first release of the tool, with some stretch goals like developing some units for XNA'ers and Monogamers (so they don't have to), and even porting it to other platforms like Mac OS X and Linux (since so far the tool only works on Windows with .NET 4.5). And yes, the link at the end of the trailer is not working right now, as the campaign is currently in draft mode.

So, this is all for now.

I hope this post sheds some light on what the APE is about and that you guys are interested. It's up to you guys to decide whether this baby eventually sees the public light (if not, I'll continue to use it for my inhouse projects). Your call ...

'till next posts!
~Pete

Thursday, December 05, 2013

MY ASSET PIPELINE EDITOR ... FTW!

Hey guys, it's been a long time since I last posted something on my blog! So I decided to share a teaser trailer of a tool I've been working on during this year for my inhouse projects ...
... behold "The APE" !!

Sexy, right?

I guess you know the purpose of this beauty, but in case you don't, I'll be posting more details soon.

So, stay tuned!
~Pete

Tuesday, December 04, 2012

GREAT ARTICLE ABOUT C++

Today I happened to find a very interesting article about C++ on the Internet, entitled “Why C++ Is Not Back”.

The article is written by John Sonmez, a native coder, who “embraced the dark side” and gave C# and XNA a try.

Please read his article. It is worth reading every paragraph. All I will say about it is that I agree with him. What is more, imho, if C++ were replaced by D, we would all be currently using D.

On a side note, C# is indeed a great language, and once it gets a proper native compiler –and not a tool created to only improve startup times- it will rock.

Cheers!
~Pete

Monday, August 13, 2012

.NET MUST DIE … TO GO OR NOT TO GO NATIVE …

… is that the question? … not really.

From time to time I dare to ask technical questions of experts in the native and managed worlds so as to better understand the differences, performance-wise, between code originally written in a native language like C++ and “native images” of code written in a managed language like, as of today, C#.

Due to the novelty around the resurgence of C++ brought by revision 11 of the standard, in one of my latest Q&A adventures I dared ask Alexandre Mutel about the eventual penalties –if any– of calling a wrapped operation in C# once the assembly gets compiled ahead of time with NGen (or its Mono equivalent, AOT compilation). Like, say, the following:

[DllImport("SomeDLL.dll")]
public static extern int SomeOperation(int h, string c, ref SomeStruct rStruct, uint type);

[For those of you who still don’t know him, Alexandre Mutel is the creator of, inter alia, SharpDX: “a free and active open-source project that is delivering a full-featured Managed DirectX API”, which currently powers the DirectX side of projects like Monogame and ANX, among others; being imvho the perfect choice for those of us who don’t want to go back to C++ and who once embraced the old ManagedDX solution that MSFT then called off in order to give birth to XNA a few months later].

I won’t dare claim that Alexandre posted this impressive article because of my email question (or my prior request of DirectMath support in SharpDX due to SIMD), but I must admit that it vanishes any doubt I might have had in the past in that regard and leads me to concur that .NET must die.

In his article, Alexandre mentions an interesting detail, or fact if you’d mind, when speaking of a managed language:

… the performance level is indeed below a well written C++ application …

… and also that:

… the meaning of the “native” word has slightly shifted to be strongly and implicitly coupled with the word “performance”.

He also references two articles about the real benefits of better Jittering:

And a finding on the Channel9 forums, indicating that MSFT is hiring to create a unique compiler to be used for both C++ and C#.

So, after reading all of the above-mentioned material, if you have reached a point in your programming life where you do value performance over safety, is the real question still whether you should go native?

Imvho, the question has then turned into “how”.

The fact that a native solution gives the performance level you are looking for does not mean that you must only use the C++ language. Even with the additions found in C++11 (a few of which could arguably have stemmed from managed languages), it still has a cumbersome and unfriendly syntax.

Nor does it mean that you won’t be able to use a language like C# to get an optimized native application for whichever platform you need (even the Web).

If, in order to get native bits, we had to always stick to “low-level” languages, then we would never have moved from Assembler (or even binary notation) to C and all of its offspring. The evolution of hardware and compilers eventually made C++ a better choice than ASM for performance-oriented apps, given that, over time, the penalty curve decreased to the point of becoming irrelevant for native programmers.

Therefore, what if you could get rid of jitting (replaced by a fully performance-oriented LLVM compiler) and still have an efficient GC for cases where manual memory (de)allocation is not needed?

Much as I hate Objective-C, due to its ugly syntax, its newest versions for the Mac (and lately, iOS) platforms offer LLVM native bits with GC.

And what about a friendlier language like “D” instead? The latest evidence leads me to believe that C-based languages are moving in its direction.

My point is that going native does not necessarily mean that all the memory management of your program must avoid a garbage collector for efficiency. Nor that you have to use languages with cumbersome or unfriendly syntax to get the most efficiency. It depends mainly on how compilers and memory-management tech evolve side by side to get the most out of the target platform, how unsafe you can go with a given language where and when needed, and how penalty-free calling native operations from external binaries can be.

For instance, despite its limitations, you can do some unsafe programming with C# (fixed, stackalloc, etc.). The problem is that this feature is not allowed on all platforms (like WinPhone7), and on some platforms the set of operations is limited (e.g.: stackalloc is not available on the Compact Framework for the XBox 360).
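For illustration, here is what stack allocation looks like in C#. Note an assumption: this sketch uses the Span&lt;T&gt; form of stackalloc from later versions of the language, which needs no unsafe context; the classic form discussed in this post requires unsafe pointers and, as noted, is not available everywhere.

```csharp
using System;

static class StackAllocDemo
{
    // Sums the first n squares using a stack-allocated buffer:
    // the temporary array never touches the GC heap, so no garbage
    // is produced and no collection can be triggered by this call.
    public static int SumOfSquares(int n)
    {
        Span<int> squares = stackalloc int[n]; // stack memory, freed on return
        for (int i = 0; i < n; i++)
            squares[i] = i * i;

        int sum = 0;
        foreach (int s in squares)
            sum += s;
        return sum;
    }
}
```

This is exactly the kind of escape hatch the paragraph above refers to: staying in C# while opting out of managed allocation for a hot path.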

And again, the D language seems to provide a friendly syntax (close to C#) while offering a power similar to C++.

Personally, I feel quite comfortable with C#; let’s be real here for a moment: I won’t be creating a Halo-like game any time soon, but I don’t want to go back to C++, say, to consume the DirectX 11 APIs. Having said that, I really hope C# evolves in a way that makes the arguments from “native” programmers trivial, and that the industry embraces it (as it once embraced C/C++ to minimize the use of ASM). Evidence shows C# will evolve in this field, but as usual, time will tell …

To wrap it up, does going native imply that .NET should die so that a syntax-friendly language like C# can survive? …

Short answer: yes (or at least, .NET as we know it today). Long answer: read all of the links provided in this post and see for yourself ;)

My two cents,
~Pete

Tuesday, July 03, 2012

IMPORTANT NEWS ABOUT WINDOWS PHONE 8

In the last few days there have been tons of news regarding the upcoming Windows Phone 8.

I am thrilled with two of them, especially:

  • The addition of new markets, and
  • Marketplace pre-compilation of apps.

New Markets

For years I have been sending comments and suggestions to Microsofties (including TPTB), so as to open the AppHub (both WP7 and XBLIG) to devs in countries outside the list of supported ones. Ditto for XBLive services and marketplaces.

As a dev living in one of the unsupported countries, it was quite frustrating that the only two ways to get your app/game into the AppHub were either opening a company in, say, the US, or talking to a publisher. Both solutions being cumbersome.

Recently, MSFT announced that for Windows Phone 8 this is finally becoming a reality. So, at launch, over 180 countries will be added to the WinPhone8’s Marketplace (consumers) and its related AppHub (developers).

Please, allow me getting it off my chest: FINALLY!!!

Marketplace Pre-Compilation

In March 2010, I submitted a suggestion to the XNA Team through the Connect site entitled “Native Image On-The-Fly” (edit: I am afraid it got lost in the last database purge, so the provided link won’t work).

My suggestion was to pre-compile all approved apps/games published on the XBLIG channel to native images. Given the identical architecture of XBox 360 consoles, this should be a straightforward process done once per app/game on the server side, with low-to-no chance of image corruption, as the target hardware would not change, saving the customer from waits stemming from jitting during execution.

The benefit is simple: faster start-up and running times for games, since the console would be executing native images of the assemblies instead of the (MS) intermediate-language versions of them (please note that memory would still be “managed”).

As a result of one of the new features introduced to WinPhone8 devices (that is, support of native code), there was no reason why the above-mentioned rationale would be kept away from the upcoming product.

Well, … MSFT also announced that the Apphub will introduce a new service: pre-compilation of assemblies.

So, if you build with managed code, the assemblies you submit to the AppHub will be compiled by the servers to native code before they make it into the marketplace.

Please, allow me again getting it off my chest: FINALLY!!!

Both pieces of news are a huge step forward. Let us hope MSFT eventually extends them to the XBox 360’s AppHub …

Cheers!
~Pete