
Monday, December 09, 2013

THE ASSET PIPELINE EDITOR - PART 2

Continuing with the second part of the series, and before getting into technical stuff (the coding side), let me introduce you to the editor itself.

When the APE starts, the editor looks like the picture below:

Main Window

By default, the only tab that is enabled is the one named "General", where you can configure general settings to control the editor's behavior. A few options speak for themselves, like "Automatic check for updates", so I will focus on the ones that I believe deserve a few words, since they affect each solution/project created in a cascade manner:

1. Default Main Folder: every time you create a new solution, this is the folder that will be shown by default on the Folder dialog. Think of it as the directory that will hold your solutions and projects.

2. Default Root Folder: for every project created, this is the relative root folder assigned by default to each container. For those like me who come from XNA-based games, this is usually set to "content".

3. Default Build Action: this is the action to execute when building assets from raw files in solutions/projects, which can be "Build Always", "Build If Newer" and "Do Not Build".
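As a rough illustration, the decision behind the three build actions could be sketched as follows (a minimal Python sketch under my own assumptions; the function name and logic are hypothetical, not APE's actual C# implementation):

```python
import os

# Illustrative sketch of the three build actions described above.
# Hypothetical logic, not APE's actual implementation.
def should_build(action, raw_path, asset_path):
    if action == "Do Not Build":
        return False
    if action == "Build Always":
        return True
    if action == "Build If Newer":
        # Rebuild when the built asset is missing or older than the raw file.
        if not os.path.exists(asset_path):
            return True
        return os.path.getmtime(raw_path) > os.path.getmtime(asset_path)
    raise ValueError("Unknown build action: " + action)
```

The "Build If Newer" branch is what makes incremental builds cheap: untouched raw files are skipped entirely.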

4. Default Copy Action: this is the action to execute when copying assets after a build action finishes on a project, which can be "Copy Always", "Copy If Newer" and "Do Not Copy". You will see later that this also refers to "moving assets".

5. Available Platforms: these are the platforms that a solution can support; you can add/remove custom platforms to the ones provided by default. The one marked with an asterisk "(*)" indicates the platform whose project will be automatically added when a new solution is created.

6. Default Writers Per Profile: this control has two purposes; you can add/remove/modify custom compilation profiles as well as the write units assigned by default to each and every one of them; again, the compilation profile marked with an asterisk "(*)" indicates the one that will be selected by default when a new solution is created.

Add Compilation Profile Window

When you add a new profile, you can set its name, the relative folder where assets will be copied/moved when built, and the default write unit for it. We'll get to the concept of a write unit in a later post.

7. Default Importers Per Category: every import unit corresponds to a category (which you can set when creating the importer), so what you define here is the default importer for each category. When you add a raw file, the editor will find the category it fits in and from there try to assign the default importer; if no importer is found, it will fall back to a default category and importer (pass-through).
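The lookup-with-fallback just described could be sketched like this (all category names, extensions and importer names here are hypothetical examples, not APE's actual add-on API):

```python
import os

# Hypothetical default importers per category and extension-to-category
# mapping, just to illustrate the fallback behavior described above.
DEFAULT_IMPORTERS = {"textures": "TextureImporter", "audio": "AudioImporter"}
CATEGORY_BY_EXTENSION = {".png": "textures", ".jpg": "textures", ".wav": "audio"}

def resolve_importer(filename):
    ext = os.path.splitext(filename)[1].lower()
    category = CATEGORY_BY_EXTENSION.get(ext, "default")
    # Fall back to the pass-through importer when the category has no default.
    return DEFAULT_IMPORTERS.get(category, "PassThroughImporter")
```

So a raw file of an unknown kind is never rejected; it simply passes through untouched.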

Note: both import and write units are plugged into the APE as add-ons, so when you execute the tool, the editor will automatically create the collection of available import and write units.

8. Save/Reload Settings: when you change general settings you have two options: either you save the settings for future use or you reload previously saved settings; if no settings are found (or saved settings get corrupted) the APE will generate and save default settings for you.

Now, what are Solutions and Projects?

A solution is the root node in the APE's logical tree, which directly holds projects as its child nodes. In other words, it is the root node that will contain asset projects.

A project is the node that contains the structure used to build assets for a specific target platform. You can have as many projects as you want in a solution, but only one per target platform.

When you build a solution, all the projects it contains will be built (unless you specify not to build one or more of them). Conversely, when you execute the build action on a specific project, the others will remain intact.

In order to create a new solution, you can open the "File" menu and select the "New Solution" option, click on the "New Solution" icon on the main tool bar, or press "Control + Shift + N".

New Solution Window

When you do that, a pop-up window will prompt you to enter specific information: the name to give to your new solution, the folder where the solution file will be saved, whether a subfolder with the name you entered must be created or not, and the "Output folder".

The latter is quite important since it indicates the path to the root folder where all assets generated for the default project will be copied or moved. It works like this: say you have a Visual Studio solution for a game you are developing (or a repository for the assets -not the raw files- to be used); you browse to this path and select it, and from there the APE will append at build time the relative folders corresponding to the selected compilation profile as well as to each container, where applicable.
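In other words, the final destination of a built asset is composed from the pieces configured at each level. A minimal sketch, assuming hypothetical folder names (not taken from APE):

```python
import os

# Hypothetical sketch of how the final destination for a built asset
# could be composed at build time: output folder + compilation-profile
# folder + container root folder + asset name.
def destination_path(output_folder, profile_folder, container_root, asset_name):
    return os.path.join(output_folder, profile_folder, container_root, asset_name)
```

For example, an output folder of "out", a "Debug" profile and a "Content" container root would yield a path like "out/Debug/Content/hero.xnb".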

The initial structure of a solution is the following:

Initial Structure Of A Solution

The only project created is the one corresponding to the target platform indicated as the default in the general settings.

As you can see in the picture above, there are two more nodes created by default within the project: the initial containers, ending in "_Default" and "_Own".

The former holds raw files shared with all projects. Thus, when you add a new raw file to that container, a node with the same name will also be added to the default containers of all remaining projects.

And the latter holds raw files assigned only to that project. So, this container is where you put raw files that you don't want to share with other projects.
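The difference between the two containers can be sketched as follows (hypothetical data structures, not APE's internals):

```python
# Illustrative sketch of the mirroring rule for containers: a raw file
# added to one project's default container shows up in every project's
# default container, while a file added to an own container stays in
# that project only.
def add_raw_file(solution, project_name, container, filename):
    if container == "default":
        for project in solution.values():
            project["default"].append(filename)
    else:
        solution[project_name]["own"].append(filename)
```

This is why the default container is the right place for platform-agnostic raw files, and the own container for platform-specific ones.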

Now, there is a third type of container: partial containers (to which I also refer sometimes as "shared" containers).

Add New Shared Container

Say that you have three projects in the solution (Windows, XBoxOne, PS4) and you want to share content between the first two (Windows and XBoxOne); in order to do that, you create a partial container in the Windows project, add the XBoxOne platform, and presto!

To add more projects to your solution, you either select the "Add Project" item on the "Solution" menu, click the "Create Project" button, or press "Control + Shift + P".

Add Project To Solution

Again, you will be prompted to specify the output folder for the new project.

Ok, moving on ...

When you create or load a new solution, you will see that other tabs will get enabled: "Solution", "Project" and "Container".

Many options in the Solution tab are similar to the ones found in the General tab; in fact, this is where you start to benefit from the cascade approach given that you have the possibility to override the default values given to each "shared" option.

Solution Tab

But there is one more relevant option you can set for a solution: "Move Assets". The only explanation I will give for now is that when you build assets, the APE saves them in a specific repository for each project and then copies or moves each asset to its final destination folder (the one located in the output folder for a project).

For projects, the logic is the same as before: tweak parameters to override default values coming from the solution or general settings.

Project Tab

Here you can also indicate:

1. Group Id: even though you can set different behaviors per platform, you can also indicate affinity among specific platforms in order to set the course of action for import/write units. For instance, for XNA-based projects, an id of zero ("0") would mean a "HiDef" profile while one ("1") or above would mean a "Reach" profile.

2. Copy To: remember that for new solutions you had to specify the output folder for the initial project? Well, you will have to do the same for each new project. So each project can have its own output path, which is strongly recommended so that you avoid the risk of overwriting asset files.

3. Default Writers: not only can you override the values that come from the solution or general settings, but you can also change the folder that will be added to the output path when building assets for a specific compilation profile (like Debug, Release, etc.).

Regarding the Container tab, there is not much you can do there other than change the root folder, which is initially set to its default value (usually, "Content").

Container Tab

For default and own containers, names cannot be changed. For partial containers, you can set a different name, and you will also get a list of the projects they share content with.

There is another tab that is also activated when a solution is created or loaded ("Writers"), but I will cover it in my next post.

Before reaching the end of this part, I would like to mention the search tool, which you can use to find nodes in the solution that start with a specific character or group of characters ("Control+F").

Search Tool

The beauty of this control is that you can do a new search recursively, that is, on top of a previous search. So if you first search for "Windows" and then for "myImage", you will get all the nodes (and their children) that start with "myImage" within the nodes that start with "Windows". By the way, the top-most node (the solution) is not considered for this feature, so the result would have been the same if you had entered "my" instead of "myImage".
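The refinement behavior can be sketched as a prefix filter applied to the previous result set (a minimal sketch with a hypothetical node structure, not APE's actual code):

```python
# Illustrative sketch of the recursive prefix search: each call filters a
# node tree, keeping nodes whose names start with the prefix (together
# with their children), and a second search can run on top of the first
# result to narrow it down further.
def search(nodes, prefix):
    result = []
    for node in nodes:
        if node["name"].startswith(prefix):
            result.append(node)  # the node and its whole subtree survive
        else:
            result.extend(search(node.get("children", []), prefix))
    return result
```

Searching the output of a previous search is what makes the tool "recursive": each query narrows the explorer to a smaller forest of matching subtrees.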

By pressing the "Reset Search" button or "Control+Alt+F", the whole solution is displayed again in the explorer.

This is a powerful tool and I'm planning to extend it with additional features in the medium term.

Well, that's it for now. In part 3 I will talk about adding raw files, compiling projects, and a few more useful features available in the APE.

Stay tuned,
~Pete

Sunday, March 03, 2013

HOW “OPEN” WILL CONSOLES LIKE THE PS4 OR THE 720 BE?

Recently, during the announcement of the PlayStation 4, Sony made the promise of bringing the “most open console” for devs.

Discussions have taken place regarding whether by “open” Sony actually means that the new hardware architecture of the PS4 is somewhat more “familiar” than the one chosen for the PS3, or that the burden on indies will be diminished to some extent.

The phrase is really interesting given that for years access to dev programs for consoles has been, and still is, surrounded by a plethora of red-tape (aka “security”) procedures, so that as a rule -as opposed to as an exception- mainly (or, if you prefer, only) big companies publish games on their “pro” marketplaces.

To be approved as a professional developer for whatever console, you need to demonstrate that financially you are backed up for the whole development and certification process (upfront fees, updates, deadlines, etc.), you must buy tailored hardware for production to have access to the corresponding SDKs, you need to prove that technically you are up to tough QA checks, and so on and so forth.

In recent years, the opening of the XBLIG channel on the XBox 360 console may have seemed to many a change in direction; however, it ended up as a way to bring attention to the console without letting an avalanche of indies stain its reputation, or even threaten the status quo of the monolithic model imposed on pros. Ever wondered why only a few lucky devs made it into the XBLA channel?

One may argue that by upholding all of the above-mentioned requirements, console enterprises keep low-quality games away from their consoles. To some extent, back in the old days, they may have; but today, equating “low-quality games” with low-budget games or indie games is getting more difficult to sustain (in many cases, such an association is unfair).

Ok, but, if this business model has been working fine for console makers, what could make them change to truly support indies?

1. Apple kicked in: like it or not, Apple revolutionized the mobile market for smart phones, smart devices and tablets. But the story did not end there …

On an Industry where sagas were turning out to be far from innovative as well as rather repetitive, Apple was smart enough to truly open the door for indies and their fresh ideas, worldwide.

In fact, and in contrast to what MSFT did with XBLIG, Apple made no distinction among devs. Everyone who pays the annual fee has access to the whole pack of features. It doesn’t matter if you are a pro or an indie. It is up to you whether to support any of Apple’s services in your game, develop your own, or consume a third party’s solution.

And every game, regardless its developer, must be tested and approved by a group of Apple’s employees.

All that, plus the fact that any game (pro or indie) could be picked to be featured on the “AppStore”, as well as the growing number of success stories among indies, contributed to positioning Apple’s devices at the top of the list of target platforms for devs.

2. Google plays: following Apple, Google decided to enter the app-publishing market for Android-based devices, and with NaCl, for browsers.

The number of devices running Android OS is growing fast and Chrome is steadily increasing its market share.

Google’s store, currently named “Google Play”, is open to pros and indies, the list of approved countries is increasing, and many games that succeeded on other platforms have been ported to Android.

Discussions aside of whether you prefer iOS or Android, there is no doubt that Google Play has served as a model for other big firms, like Amazon and Samsung (with Chillingo), which decided to follow its lead and open marketplaces of their own.

3. The power of Steam: there are many gamers -particularly hardcore ones- who claim that PC games have nothing to envy in console games. For those gamers, Valve introduced Steam as a means of digital delivery of videogames.

It first started on the Windows platform, then added MacOSX, and recently added support for Linux. At the beginning only native-coded bits were allowed, but later Valve allowed managed code.

In recent months, Steam introduced its “Green Light” program, offering new publishing opportunities targeted at indies. In this program, the community decides which games should be granted a green light for publication. One example is a well-known tower defense game named “Kingdom Rush”.

Steam succeeded where others, like Games for Windows, failed, making its services attractive to many devs globally. The buzz was so loud that companies like Apple and MSFT now have their own stores, like the Mac App Store and the Windows Store.

Valve’s next move? The Steam Box console … “Piston” …

4. Secret Wars: imvho, consoles as we know them are getting closer to obsolescence. Why? With current and potential customers buying more and more smart mobile devices instead of consoles, conditions are shifting towards mobility.

As a matter of fact, there is a secret war going on right now among console makers to turn the experience of using a console into a whole multimedia one. Games, movies, music, applications, Internet, and whatever fits the Cloud-Computing agenda will be included in consoles.

But it doesn’t stop there … you will also be able to carry that experience with you on your mobile smart devices. UserId synchronization over the Cloud plays a huge role here. Say that you were playing some game on your console but have to leave; don’t worry, you will be able to continue playing that exact game on your smart phone on the subway without having to start a new match; just resume it and presto! Ditto for movies, music playlists, etc.

Now, you could be asking what indies have to do with this. The answer is simple: despite their budgets, small companies can react faster to changes in the environment. Ymmv, but the larger the payroll and the infrastructure to sustain, the slower a company is to accommodate to new market conditions, since decisions in a big company usually involve many people at many levels of the organogram.

For each new device, indies would most likely be willing to take a chance. Big companies, unless an exclusivity contract is on the table, or at least an exposure/marketing deal, would wait to see what happens with the device (take the PS Vita, for example).

5. Ohhh Yea!: thanks to a surprisingly successful Kickstarter campaign, the upcoming console “Ouya” is about to join the market.

The console will target indies, but this doesn’t mean that you will be able to publish videogames without an approval process.

It’s too soon to foresee the fate of this Android-based console, but if Ouya succeeds, it will put a lot of pressure on traditional console makers, and, who knows, it may all end up being quite positive for indies in the medium term.

So what can traditional console makers do to avoid oblivion?

For starters, traditional console makers must realize that profit is more and more associated with service-based models and less with classic business practices like setting high initial price tags for new consoles.

In this sense, the current monolithic model, where dev companies are required to buy special production hardware and SDKs, pay high upfront fees, and spend crazy amounts of money on the QA of updates and patches, could rapidly become a huge stone in the console makers’ shoes.

The more games a console gets, the more its maker earns. Plus, makers also profit from a small annual subscription charged to devs per platform. So the more devs a platform gets, the more profit it makes from subscriptions.

The beauty of this equation is that the console maker always gets a profit from the games you sell and the subscription you pay, even if a game fails to gain popularity (and you don’t get money out of it). Multiply this income by a large number of subscriptions and published games and you have a winning strategy.

Make no mistake here: this doesn’t mean that games shouldn’t be verified and authorized before publication, but let the market itself do the ultimate quality check on published games.

Now, there is an additional key issue to solve in order to ensure a critical mass of devs and games. Both indies and pros should be able to access all official services available for the platform, with no distinction. As I mentioned before, on iOS devs can integrate leaderboards, achievements, and social interaction, to mention just a few, regardless of their status as indies or pros.

Last but not least, a proper built-in videogame-exposure system cannot be neglected on each platform. Usually, big companies can run ambitious/aggressive marketing campaigns, so they will likely get a spot in the featured areas of the stores. So a way to attract indies is to expose their games as if they were made by pros. The 360’s XBLIG channel is THE example of what NOT to do!

To wrap it up …

The videogame market as a whole has changed big time in recent years. Smart devices introduced new challenges to both PC and console makers. Thus, to stay in the game, one may expect a strategic leap on traditional consoles in the medium term. Otherwise, these consoles may face an important drop in sales in comparison to previous editions.

How open a console may be deemed will therefore depend on, or if you prefer, will be inversely proportional to, the barriers imposed on indies.

Let’s hope that both the PS4 and the XBox 720 consoles get really open to indies this time, by offering an attractive business model, worldwide.

Cheers!
~Pete

Friday, February 15, 2013

THE FATE OF XNA … NOW WHAT?

Lately there has been lots of speculation and comments on the Web regarding the fate of XNA as a result of these blog-posts.

Due to technical difficulties with my main system I am arriving late at the party; many articles and tweets are out now, but anyway, I will give my view on the subject.

For me, the phase-out process that MSFT has been carrying out silently for, what, a couple of years, a year and a half, a year, <include your estimate here>, is not precisely a surprise. In fact, I stopped working on all projects based on XNA tech in late 2010 because something was troubling me.

At that time, I was an XNA/DX MVP creating my game engine, replacing XNA’s Content Manager with my own version of it, developing a videogame, to mention just a few, but for some reason I was holding myself back before starting a game-dev business based on XNA tech.

The hunches -based on facts- that supported my decision back then now prove, in hindsight, that my wait was right. Of course, it is important to note that this worked for me; in other words, YMMV.

1. HUNCHES AND WARNING SIGNALS

Let’s see, in no particular order, these are the hunches that caught my attention:

  • Communiqués started to slow down: these were a great read on the XNA Team blog, but suddenly they started to fade out.
  • Our Community Manager moved to another division: we all remember her xxoo’s at the end of her messages and posts. That unexpected departure was the first warning signal to me.
  • XNA 4 was gradually presented as a “mature” product: or expressed in a different way, XNA was not likely to receive (major) updates. Maybe this one was very difficult to gather at that time, but for me it was the second warning signal.
  • Lack of strong support for XBLIG: how many times did community members (and even MVPs) call for proper marketing, fast opening of new markets, or even a decent location on the Dashboard? In practice, MSFT turned out to be reluctant, so third warning signal.
  • Lack of XBox Live services for XBLIG: in addition to the previous one, how many times did community members call for Leaderboards, Achievements, DLC, and so on and so forth? Do you guys at MSFT really expect games with no global leaderboards to survive the increasing demands of gamers?
  • Communication of future moves to MVPs: in the past, before entering a new dev phase, the Team used to involve XNA/DX MVPs in design decisions. Maybe for many readers this is not relevant, but from the perspective of an MVP who to some extent used to be involved in the roadmap, being asked “what do you guys think of …?” just a few days before going public is a warning signal. Fourth one, indeed.
  • The format of .xnb files was published to the world: this one might have been handy to me if published a couple of years earlier, but combined with the one below, it gives -more than an indication- a confirmation that MSFT was silently phasing out XNA. Fifth warning signal.
  • Gradual relocation of all members of the XNA Team: when you saw one of the most important programmers on the Team move to a different division at MSFT, and no one relocated or hired to take his place for further development of XNA, (please be honest here) did you really think that everything was ok? Sixth warning signal. A major one, if you ask me.
  • Unattended suggestion on Connect: after the database clean-up the XNA Team did on its Connect’s page, suggestions were marked more and more as “Active”, “Postponed”, “By Design” and “Won’t be fixed”. Seventh warning signal.
  • DirectX SDK will not be updated any longer as such: let us clarify this point: the DirectX SDK was integrated into the Win8 SDK for the newest version of DX. What happened with the SDK for DX9.0c? Eighth warning signal.
  • No XNA 4 for Windows RT: this is a technicality, but given that DirectX 9.0c does not get along with ARM processors, unless XNA gets a redesign based on DX 11.1, it is pushed out of the picture for Surface (ARM-based) tablets. Since the XNA Team has been disbanded, unless a new official product unexpectedly comes out of the shadows for .NET, hoping for an official rope is kinda naive. Ninth warning signal.
  • XNA does not support WinPhone8, or does it?: after all the worries, talks and efforts to provide safe environments, MSFT radically changed course by allowing the execution of custom native code on the new Windows Phone 8 devices. This sounded like heaven for XNA’ers until MSFT announced that XNA wouldn't add support for WinPhone8. Games created with XNA for WP7 still run on WP8 devices, but they will not be able to take advantage of unsafe operations on the device. Tenth warning signal.
  • XNA is not integrated into VS2012: as a corollary of the point above, XNA was not integrated into VS2012, which in turn means that if you need to use the content pipeline, you will need to install VS2010 side by side with VS2012. I don’t know, eleventh?
  • No MVP award for XNA/DirectX: I can understand the decision for XNA given that it has been and still is being phased out, but why must the award for DirectX also be doomed? Despite the fact that the SDK is now part of the Win8 SDK, imho it is still a separate kind of expertise that cannot be merged with other areas. Final warning signal = confirmation.

As a former XNA/DX MVP as well as an old timer using MSFT’s technology, let me say that lately it has been really difficult to recommend the use of XNA to create games professionally given the facts above.

What can you say to devs when they ask questions like: “Can I use XNA for Windows RT?”, “Will XNA be integrated into VS2012?” or “Will XNA support DX11?”? Ditto for the question below …

2. WILL THERE BE A NEW OFFICIAL SOLUTION FOR .NET?

It is very difficult to foresee what’s coming next in terms of .NET and game development given the difficulties one may find when trying to deduce what the heck TPTB at MSFT are currently thinking/doing.

But let us see, to update XNA (or replace it) MSFT may consider that …:

  • … there is a novelty around “Going Native” with C++11 inside MSFT itself.
  • … to support ARM processors, the new tech needs to be built on top of DX11 APIs (which supports “legacy” cards by only enabling the subset of allowed features for the card).
  • … XNA is neither a pure DX9-wrapper nor a game engine, making it difficult to justify its maintenance.
  • …  the dream of “develop once, deploy to the three screens” vanished given that not all the features supported on the PC were supported on the 360 and the WP7 platforms. Plus, the screens are changing: WP8, Surface, XBox.Next, ...
  • … due to the managed design of XNA, and in spite of some indie impressive efforts (like this one and also this one), XNA lacked middleware support of big fishes in the Industry.
  • … there was never a world/level editor. XNA is VS centric, so how can it compete with editor-centric solutions like Unity3D or UDK?
  • … last but not least, XBLIG failed as a business line and new lead marketplaces for indies emerged (Win8, WP8). Period.

So, to answer the original question: with C++ regaining position inside MSFT and DX11.1 being mandatory for the latest platforms, why bother? Which leads us to the next question …

3. WHAT CAN “XNA’ers” DO NOW?

You feel disappointed. MSFT let you down (for some, again). You cannot find the exit from this nightmare. And you do not want to learn or get back to C++.

If that is your case, then do not panic! Right now there are many alternatives out there for you to consider, especially if you like or love C#:

1. SharpDX: created by Alex Mutel as an alternative to SlimDX, this pure wrapper of DirectX (from DX 9.0c to DX 11.1, both included) has positioned itself as the lead solution for advanced users who want to program DX games on top of the lowest level available to C#.

This set of APIs is open source and is consumed by many of the solutions listed next. What is more, games for Win8 from MSFT Studios (through partners like Arkadium) have been developed using SharpDX (i.e.: Minesweeper, Solitaire, and Mahjong).

Alex has also been developing a Toolkit to ease development of common tasks (sound familiar?), which certainly extends a bridge to those of us coming from XNA.

2. Monogame: the open source sibling of XNA. Fueled by SharpDX for all latest Windows-based platforms. Multiplatform not only for Windows, thanks to Mono.

With few-to-none modifications to the source code of your XNA creations, you can port your games to a wide variety of platforms.

This open source solution has recently reached its third stable version, adding many requested features, like 3D support.

Although it lacks a content pipeline replacement, which is currently under development, it can be used from VS 2010 and VS 2012.

Many well-known games have been created with Monogame (or adaptations of it), like Bastion and Armed!, among others.

Last but not least, the community around Monogame is growing strong. As a matter of fact, if you like “the XNA way”, then this is your perfect choice.

3. ANX: a competitor to Monogame. Its name, in case you did not notice, is XNA reversed. Recently, after a long wait, v0.5_beta has been published.

Not many games have been created with this solution yet and its community is rather small -in comparison with Monogame’s- but its progress is definitely worth following closely.

4. Paradox: I really do not know how Alex finds the time, but he is also developing a multiplatform game-dev solution for .NET with a data-driven editor!

Of course, the Windows-targeted portion of Paradox is based on SharpDX, but the engine will also offer deployment to other platforms based on OpenGL.

No pricing or release dates have been disclosed yet, but having read the features and watched the images and demo videos, it is by far a very serious alternative to consider.

5. DeltaEngine: the lead dev of this multiplatform solution is the first XNA MVP that wrote a book about XNA.

Coding by using this solution resembles coding with XNA. It has its own multiplatform content pipeline which optimizes output per platform, among other tools. And games like Soulcraft show off the power of the solution.

You can check the pricing here.

6. Axiom: having been a user of this solution before the time of XNA, I am very pleased to see that the project has revived.

Axiom is now a multiplatform solution for .NET based on the popular OGRE graphic engine, which also consumes SharpDX for Windows targets.

Honestly, I do not know whether there are games created (and published) with this solution, but I hope there will be, sooner rather than later.

7. WaveEngine: Vicente Cartas (MVP for XNA/DX) has just let me know about this cross-platform engine, which will be released as a beta in less than a day (thanks for the tip!).

Oriented towards the mobile-dev market, the engine is the result of a two-year effort by the Wave Engine team. Knowing Vicente's past work on JadEngine, I cannot wait to watch some cool demo videos here (like Bye Bye Brain).

Best of all, the engine is completely free, so it is without a doubt worth trying as soon as it gets released!

8. Unity3D: I cannot forget to mention Unity3D since it started almost at the same time as XNA; however, adoption among devs grew exponentially in later years because of a combination of factors: a robust editor, multiplatform support, an increasing number of appealing features, and a variety of well-known success stories among indies (for instance, ShadowGun).

Make no mistake here, the experience of using Unity3D is quite different from XNA’s: it's editor-centric, code -either in C#, Javascript or Boo- serves as scripts, sometimes you need to broadcast messages -as opposed to following an OOP rationale- and, last but not least, 2D programming is not straightforward (not even in the latest version; you need to buy one of the available plugins as a workaround).

You can check the pricing here.

As you can see, even if no official solution replaces XNA, its spirit remains in many of its successors, all of which support the latest DX11 HW.

So imho as a dev, there is no need to worry. Your knowledge is still valid for the above-mentioned alternatives.

4. OK, BUT WHAT ABOUT MSFT?

Well, imho it would be deemed as positive by XNA’ers (and indies in general) if MSFT …:

  • … does not try to impose C++ as the only language to develop quality games.
  • … develops a common compiler for C++/C#, for all supported platforms.
  • … implements SIMD ops for .NET (please vote for it).
  • … reduces differences for .NET among the latest “screens”.
  • … supports open efforts like SharpDX and Monogame (it seems it will).
  • … publishes as open source the parts of XNA's source code that do not imply a security risk or bring any potential legal issues to the table (like, say, the content pipeline).
  • … reduces barriers for indies (like, say, access to XBox Live services) on the upcoming XBox.Next so as to compete with other platforms like Ouya, iOS, Steam and so forth.
  • … and continues to support indies through initiatives like the Dream.Build.Play compo.

Personally, I do not care which language or solution a dev picks to develop a game, provided it is the right one for the project. In this sense, the “Going Native” campaign that some people at MSFT may seem to support -by stressing perf differences between C++ and C# whenever they can- is imho unnecessary, given the fact that there are many successful indie games out there developed with managed code.

Plus, as a former C++ dev, I do not want to go back to C++ because I feel really comfortable with C#. If sometime in the future I had to drop to a lower-level language, I would prefer “D”.

Thus, I hope MSFT creates a common compiler for C++/C#, which in turn would make hybrid solutions a common scenario for indies.

5. TO WRAP IT UP …

Without starting a nonsense discussion over a Pyrrhic victory, imho the fate of XNA was predictable if you took a careful look at announcements from MSFT, whether you deemed them facts or mere hunches.

But one thing remains strong for sure: XNA’s spirit.

Thanks to solutions like SharpDX and Monogame one can still talk about C# and XNA-based coding as a valid option for a game-dev business.

Cheers #becauseofxna!
~Pete

Friday, September 28, 2012

RAZER’S LATEST LINE OF PRODUCTS

In a year of trends to go mobile, with a variety of offers to pick from, I must admit that I am sometimes amazed by alternatives that reinforce the desktop world.

One of these alternatives is this keyboard by Razer:

Man, I love the design and the concept. Some specs:

  • 4.05” touch screen able to run widget apps.
  • Track pad with gesture support.
  • 10 dynamic display keys with 80 Hz response time.
  • Chiclet style key caps.
  • Tri-colour backlit keys.
  • 1000Hz Ultrapolling.
  • Fully programmable keys with on the fly macro recording.
  • Razer Synapse 2.0 enabled.
  • Dedicated Gaming mode.
  • 5 additional macro keys.
  • Anti-ghosting capability for up to 10 simultaneous key presses.
  • Braided fiber cable.
  • Fixed wrist rest.
  • PC with USB port.
  • Windows 7 / Windows Vista / Windows XP.

This is really a nice, handy and interesting product. If only it supported the upcoming Windows 8 and Surface tablets, it would be not far from perfect.

The only drawback, as with any top-notch product long before it becomes standard, is its initial retail price -currently at U$S 249.99- which may prove prohibitive for many.

I would love to see future editions of this keyboard where all its keys get dynamic-displayed.

As a side note on desktops, other nice products to mention are Razer’s Blade notebooks, which integrate the above-mentioned keyboard solution flawlessly.

Again, everything goes well until you see the prices, which start beyond U$S 2,000, without VAT and shipping costs.

I cannot wait to see what Razer will come up with next; don’t you? Not to mention getting one of these, if I ever get to afford one.

Cheers!
~Pete

Thursday, September 13, 2012

NO MORE GAME DISCS, PLEASE !!!

Today, prices and availability dates have been unveiled for the upcoming Wii U console.

Among its specs it is mentioned that game discs for this console will have a 25GB capacity while the internal HDD storage will be either 8GB or 32GB.

I don’t know whether you think the same, but I believe it is time for the game industry to move away from CDs and DVDs.

In spite of the improvements some consoles have introduced in order to avoid scratches (like the XBox 360), it’s really annoying and frustrating when a game disc eventually gets scratched in an area relevant for the game to run properly.

So, as we wait for a full switch to the Cloud nirvana, why not replace discs with other hardware like flash drives? Nowadays, a memory stick can have a large capacity.

I’m not talking here about empty flash drives that you buy and then plug into the console to save downloadable games, but about drives already prepared and commercialized by publishers, containing the game.

Imagine a flash drive with one read-only memory area (where the first version of the game is stored) and a protected memory area for patches (I will leave game content out of this picture, for now). It would be like going back to the cartridge era with a modern twist.

Smaller box-art and more portability for games not commercialized through the Cloud are some of the additional benefits.

So let’s hope devs of next-gen consoles -like the XBox 720- embrace this thought …

Cheers!
~Pete

[Btw, this could be also applied on laptops, notebooks, ultrabooks and so on so forth]

Thursday, August 23, 2012

INTERVIEWING DEAN DODRILL, CREATOR OF “DUST: AN ELYSIAN TAIL”

What’s the dream of game developers like myself? To get an opportunity, even the slightest one, to publish your own game title on the big leagues. That game that you always dream of creating from scratch. Your masterpiece. Your 9th Symphony …

Some of us, generally “indies”, even dare dream of watching that game become a success once it goes gold. The kind of success that allows us to officially become part of the Industry from that moment on, with a critical mass of loyal gamers waiting for our next titles with sincere smiles of joy on their faces.

For Dean Dodrill, creator of the acclaimed game entitled “Dust: An Elysian Tail”, the dream has become a reality.

For those of you who still don’t know, Dean’s game (Dust: AET) was the grand-prize winner of the Dream.Build.Play contest held back in 2009. Recently, the game went gold on the Live Arcade marketplace for the XBox 360 console (“XBLA”), as part of the Summer of Arcade 2012 promotion (“SoA”).

As soon as the game was released, it received (and still receives) lots of positive reviews, articles, kind words from buyers, a zillion tweets and FB posts, and ratings varying from 8.5 to a perfect-10 score.

Most of them, like this interview with the guys of IGN’s Podcast Unlocked (which I recommend listening to), focus on the story behind the creation of the game, and on the game itself, from the perspective of gamers.

So, since the game was fully developed with C# and XNA, and being a strong supporter of both technologies for years, I decided to try my luck and interview him from the perspective of an indie XNA'er.

Well, … guess what? Dean kindly answered all of my questions, so be prepared to read his responses after watching the launch trailer of his game.

Ok, we’re back. Before posting the interview I want to thank Dean publicly for accepting the interview and taking the time to answer all the questions.

Now, enjoy the reading …


Q: Are sales going as you expected? I mean, I don’t want to know the figures; instead, I want to know whether they have reached a point where you can continue developing games professionally (you know, to continue living the dream) or not (= it makes your family’s life better for a while but it is not enough to go beyond).

A: It’s a little early to determine how well sales are going, but I do believe the game will allow me to continue game development, at least at the scale I’m currently working on.

Q: What was it like using the XNA framework -from an artist's viewpoint, given your lack of programming experience? (I mean, pros and cons.) Which features would you have loved for C#/XNA to have built-in (I mean, that lacking feature that forced you to use a workaround or take a programming detour)?

A: Since I’ve never programmed with anything other than C#/XNA I can’t really compare it to other languages. I will say that I found it fairly easy to pick up, and since much of my code resulted in some sort of visual feedback in the game, iteration was fun. Garbage collection on the 360 was always a hassle, but I loved many XNA specific niceties, such as SpriteBatch and streamlined gamepad support. I also love XACT and how relatively easy it was to work with audio and effects. If I could help I would continue working with XNA exclusively.

Q: Which features of the extended XNA APIs for Live Arcade did you use? Again, what was it like using them?

A: I did have to use the XNA extensions for XBLA, and admittedly most of that was a hassle. The biggest issue is that most of it is poorly documented, and there were always certification issues which were inherent of XNA. Integrating Leaderboards and Achievements was one of my least favorite parts of the process. I definitely got the feeling that XNA wasn’t created with XBLA in mind.

Q: Given that you got a contract with MSFT, do you still own the IP rights to Dust: AET? Did you receive financial advantages, like not having to pay for (re)certification and/or dev/test kits? (If you can comment on it, of course.)

A: I do own the IP to Dust:AET, but of course have signed an exclusivity period with MS. MS helped with testing and localization, and assigned me an excellent producer who helped push the game through the system (as well as offered valuable design suggestions). It’s a mutually beneficial agreement, otherwise I can’t go too much into details.

Q: Are you planning additions/extensions to the game? Say, now that you have met the deadline for Summer of Arcade, would you want to add that feature that stayed behind, one you would have loved to develop for the release, by “unlocking some extra time”?

A: I haven’t given much thought to anything like DLC. Thankfully I didn’t have to cut anything to meet the SoA deadline, it was just a matter of compressing the schedule down and working VERY hard for a few months. Given more time I would have liked to polish a bit more, but that’s the curse of any project I’m sure. I do have plans for future games in this universe, but nothing to announce at this time.

Q: Are you planning to port the game to other MSFT platforms like WinPhone8, Win8 and the Surface? (for instance, by using Monogame or ANX).

A: MS and I haven’t discussed anything outside of XBLA at this time. I was honestly so busy focusing on the XBLA release that I hadn’t considered a port. If anything pops up I’ll be sure to announce it, but XBLA remains my focus as of this writing.

Q: Thanks in advance for reading, for your response, and for such a great XNA game, which serves as a great inspiration for us indies.

A: Thanks for the interview, Pete.


Game Description:

Immerse yourself in a gorgeous hand-painted world on a search for your true identity. As the mysterious warrior, Dust, your action-packed journey will take you from peaceful glades to snowy mountaintops and beyond. At your disposal is the mythical Blade of Ahrah, capable of turning its wielder into an unstoppable force of nature, and the blade's diminutive guardian, Fidget.

Battle dozens of enemies at once with an easy-to-learn, difficult-to-master combat system, take on a variety of quests from friendly villagers, discover ancient secrets and powerful upgrades hidden throughout the massive, open world, and uncover the story of an ancient civilization on the brink of extinction as you fight to uncover your own past.

  • Take control of Dust, a warrior searching for his true purpose, as he joins forces with the mystical Blade of Ahrah and its guardian, Fidget, to save the world of Falana from an army unlike any before it!
  • Explore an incredible hand-painted world!
  • Match wits and weapons against challenging monsters!
  • Take on side-quests from a cast of colorful, fully-voiced characters!
  • Craft dozens of items and discover Falana's rarest treasures!
  • Compete against your friends' high scores in ranked Challenge Arenas!

Nice interview, don’t you think?

Not only is the game fantastic, but it may also help developers finally understand how powerful C# can be when coupled with a fine tech like XNA -despite unavoidable performance differences with native bits- when you use the tech right, even as a one-man band (as in Dean’s case).

It’s a pity that many devs (pro and indie) still deem XNA a tool for kids and hobbyists only, and don’t give it an opportunity. And what is worse, it’s a shame that MSFT -at least for what is publicly known so far- won’t update it any longer.

To wrap it up, Dust: AET shows off quite enjoyable gameplay and incredible art, as well as the mechanics behind its 2D environment -skeletal animations, particle systems, input combos and shaders, to mention just a few.

So, what are you waiting for? Go and buy it now!

Cheers!
~Pete

Monday, August 13, 2012

.NET MUST DIE … TO GO OR NOT TO GO NATIVE …

… is that the question? … not really.

From time to time I dare ask technical questions to experts in the native and managed worlds, so as to better understand the performance differences between code originally written in a native language like C++ and “native images” of code written in a managed language like, as of today, C#.

Given the buzz around the resurgence of C++ due to revision 11, in one of my latest Q&A adventures I dared ask Alexandre Mutel about the eventual penalties -if any- of calling a wrapped operation in C# once the assembly gets compiled ahead of time with NGen (or its Mono equivalent, AOT compilation). Like, say, the following:

// A typical P/Invoke declaration: the managed method forwards the call
// to a native function exported by SomeDLL.dll.
[DllImport("SomeDLL.dll")]
public static extern int SomeOperation(int h, string c, ref SomeStruct rStruct, uint type);

[For those of you who still don’t know him, Alexandre Mutel is the creator of, inter alia, SharpDX: “a free and active open-source project that is delivering a full-featured Managed DirectX API”, which is currently leveraging the DirectX side of projects like Monogame and ANX, among others; being imvho the perfect choice for those of us who don’t want to go back to C++ and who once embraced the old ManagedDX solution, which was then called off by MSFT in order to give birth to XNA a few months later].

I won’t dare claim that Alexandre posted this impressive article because of my email question (or my prior request for DirectXMath support in SharpDX due to SIMD), but I must admit that it dispels any doubt I might have had in the past in that regard, and leads me to concur that .NET must die.

In his article, Alexandre mentions an interesting detail -or fact, if you will- when speaking of a managed language:

… the performance level is indeed below a well written C++ application …

… and also that:

… the meaning of the “native” word has slightly shifted to be strongly and implicitly coupled with the word “performance”.

He also references two articles about the real benefits of better Jittering:

And a finding on the Channel9 forums indicating that MSFT is hiring to create a unique compiler to be used for both C++ and C#.

So, after reading all of the above-mentioned material, if you have reached a point in your programming life where you do favor performance over safety, is the real question still whether you should go native?

Imvho, the question has then turned into “how”.

The fact that a native solution gives you the performance level you are looking for does not mean that you must only use the C++ language. Even with the additions found in C++11 (a few of which could arguably have stemmed from managed languages), it still has a cumbersome and unfriendly syntax.

What is more, neither does it mean that you won’t be able to use a language like C# to get an optimized native application for whichever platform you need (even the Web).

If, in order to get native bits, we should always stick to “low-level” languages, then we would never have moved from Assembler (or even binary notation) towards C and all of its offspring. The evolution of hardware and compilers eventually made C++ a better choice than ASM for performance-oriented apps, given that, over time, the penalty curve decreased to an extent that it became irrelevant for native programmers.

Therefore, what if you could get rid of jittering (replaced by a fully performance-oriented LLVM compiler) and still have an efficient GC for cases when manual memory (de)allocations are not needed?

Much as I hate Objective-C due to its ugly syntax, its newest versions for the Mac (and lately, iOS) platforms offer LLVM native bits with GC.

And what about a more friendly language like “D”, instead? The latest evidence leads me to believe that C-based languages are moving in its direction.

My point is that going native does not necessarily mean that all the memory management of your program must avoid a garbage collector for efficiency. Nor does it mean that you have to use languages with cumbersome or unfriendly syntax to get the most efficiency. It depends mainly on how compilers and memory-management tech evolve side by side to get the most out of the target platform, how unsafe you can go with a given language where and when needed, and how penalty-free calling native operations from external binaries can be.

For instance, despite its limitations, you can do some unsafe programming with C# (fixed, stackalloc, etc.). The problem is that this feature is not allowed on all platforms (like WinPhone7), and on some platforms the set of operations is limited (e.g., stackalloc is not available on the Compact Framework for the XBox 360).
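As a rough sketch of what that unsafe subset looks like on platforms that do allow it (the class and method names below are mine, purely illustrative; the code must be compiled with the /unsafe switch):

```csharp
using System;

public static class UnsafeDemo
{
    // "fixed" pins the array so the GC cannot move it, and hands us a raw
    // pointer we can walk without bounds checks inside the loop.
    public static unsafe int SumFixed(int[] values)
    {
        int total = 0;
        fixed (int* p = values)
        {
            for (int i = 0; i < values.Length; i++)
                total += p[i];
        }
        return total;
    }

    // "stackalloc" carves a scratch buffer out of the stack: no heap
    // allocation, so no garbage-collection pressure at all.
    public static unsafe int SumStack(int count)
    {
        int* buffer = stackalloc int[count];
        for (int i = 0; i < count; i++)
            buffer[i] = i + 1; // Fill with 1..count.

        int total = 0;
        for (int i = 0; i < count; i++)
            total += buffer[i];
        return total;
    }
}
```

Avoiding GC pressure this way is exactly why these constructs matter on the 360's Compact Framework, where collections are costly; a pity, then, that stackalloc is the one missing there.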

And again, the D language seems to provide a friendly syntax (close to C#) while offering a power similar to C++.

Personally, I feel quite comfortable with C#; let’s be real here for a moment: I won’t be creating a Halo-like game any time soon, but I don’t want to go back to C++, say, to consume DirectX11 APIs. Having said that, I really hope C# evolves in a way that makes the arguments from “native” programmers trivial, and that the industry embraces it (as it once embraced C/C++ to minimize the use of ASM). Evidence shows C# will evolve in this field but, as usual, time will tell …

To wrap it up, does going native imply that .NET should die so that a syntax-friendly language like C# can survive? …

Short answer: yes (or at least, as we know it today). Long answer: read all of the links provided in this post and see for yourself ;)

My two cents,
~Pete

Thursday, June 21, 2012

WINDOWS PHONE 8 … YOU MEAN 7.8? NOPE …

Microsoft is letting a very interesting group of cats out of the bag these days. First the news of its tablets (Surface), and now the news related to a new phone: Windows Phone 8.

You may be thinking that it is not a new phone after all, but a whole rebrand of an existing phone packed with a future update. If that is the case, you would be wrong. So wrong, in fact, that you missed the lines that explain/claim that:

  1. The phone will be bundled with a new OS, which requires more powerful hardware than the one existing on WP7 devices,
  2. Windows Phone 7 devices will NOT be receiving this new OS but an update (to version 7.8), which will help to reduce the UX-gap between phones, and
  3. The new devices will allow devs to use native code and do native calls. Yes, C/C++ …

Let us talk about each of these points for a moment. Shall we?

(I) The new OS

To keep the story short: if Windows 8 will be out soon, what else could you expect?

Now, the long explanation …

Microsoft is trying to extend the success around the XBox 360 console -yes, despite the RROD problem- to other platforms in order to unify the user experience on the multimedia front and compete with other big players, like Apple and Google, for their market share. And the new OS is a step on that front.

A broad set of hardware will be powered by the Windows 8 OS in the near future (PCs, tablets, …, consoles?), which in turn will help MSFT position itself as a strong provider of a unified multimedia experience. So, providing a new OS branded with the number “8” for Windows phones is something one could have expected some time ago.

Will its strategy succeed? Well, that leads me to the discussion of the second point.

(II) The “Old” Phones

In my previous post, I briefly mentioned the doom of a device that seemed promising by the time it came out: the Zune HD.

Let me be clear here. The first Zune devices had nothing to offer against their “i” counterparts. But that was not the case for the Zune HD: neither in hardware, nor in software.

The Zune HD device opened the door for the Windows Phone and also influenced, to some extent, the look’n’feel of both, the current 360’s dashboard and the UI of the “8-based” OS. And yet, it was left behind in the dust …

Now, if the “old” 7 phones will be receiving an update in order to let users experience a taste of what the new OS will offer, why mention what happened to the Zune, then?

Because users (customers and devs) could deem this behavior a tendency, as if they were treated as mere beta testers of MSFT’s experiments with mobile hardware: Zune, Zune HD, the infamous Kin, developer WP7 devices, and eventually retail WP7 devices.

Not to mention that this may represent a slap in the face of one of its newest and major partners, Nokia, which recently released a new line of WP7 devices into the market!

One can understand that a new OS may require new hardware to enjoy the full set of features it might offer. But rushing and/or pushing things into the market this way, given a track record of “no-more-support” deeds in a short period of time, could only become a winning move if MSFT shows a strong commitment from now on to support its upcoming devices for a reasonable minimum number of years.

Make no mistake, I am a MSFT supporter, but that does not prevent me from chiming in and issuing a wake-up call when I see a warning signal.

As usual in life, time will tell …

(III) Native Code

“Developers, developers, developers …”

Allowing native code on a Windows Phone is a fantastic move!!!

But stating that by allowing native code it would be easier to port existing frameworks to the windows-phone environment is, imho, vague. It depends on a combination of factors: how many platforms you want to support, the technologies you use to develop apps, and/or how sensitive you are to code fragmentation (since this feature will not be available to WP7 apps), to mention just a few.

Now, does this mean that you must use native languages for WP8? The answer is “No”. You can still use managed ones like C# (especially if you want to create apps and games for both WinPhone 7 and 8 devices).

Does this mean you can still use XNA to produce games for WP8 devices? In spite of the fact that it increasingly seems that XNA will no longer be updated -and no other official managed solution will take its place- the answer is “Yes”.

However, the answer for XNA-based games seems to be “No” for other platforms like the new “Surface” tablet and Windows 8 on ARM (on desktop mode you could still use it), unless you switch to unofficial solutions like Monogame or ANX.

Personally, I do not care which languages/techs a dev picks to develop apps and/or games. But I do care about using the same languages/tech for as many target platforms as possible, to economize resources. That is why solutions like Unity3D are so popular these days …

So I hope MSFT eventually returns to the dream of “The N Screens”, allowing on all its devices -with the fewest key differences in what can be used/called/consumed- both native languages and native calls from managed code, so that devs can pick the right combination for their needs and/or preferences …

To wrap it up, some may be in favor, some may argue about pros and cons, and some may complain, but let us be honest and recognize that interesting times lie ahead.

The only news left behind -at least, for now- is related to the rumored XBox 720. But who knows? Maybe we will be receiving some official words from MSFT sooner than we expect …

Cheers!
~Pete

Friday, September 10, 2010

GETTING A REFERENCE TO A METHOD-CALLING OBJECT IN C#

Ok, after taking a month off from any blogging activity, I guess it is time to catch up and blog again. And I believe there is no better way of doing so than to write an article explaining a way to get a reference to the object that calls a specific method.

For the last two years or so, I have been working hard on a new Content Manager replacement API for the XNA Framework to use in my own “never-ending” game engine (I will blog about this later on). In some early stage of development I found a way to solve this referencing burden that has been creating headaches for many devs globally (me being one of them, for a long time).

Although I must admit that at some point in the development cycle of my API I decided to abandon this workaround, after reading one of the articles mentioned by Shawn here (guess which one), I believe it’s a great opportunity to talk about it.

A little history … before finding this solution I tried everything: from attempting to use -with no luck- the info held in StackFrame instances, and even delegates, to finally “quitting” and using the naive approach of passing the caller as a parameter in the signature of an operation. But then came .NET Framework 3.5 and extension methods to the rescue!

Please bear in mind that the example presented in this article is just a simple one to show how to use the technique, but believe my words when I say that it can be extended to complex scenarios.

Having said that … if you want to know which object is calling a certain method of a class like the following (which may or may not be sealed):

  public sealed class MyCalledClass
  {
    internal bool JustReturnTrue(int myParameter)
    {
      return true;
    }
  }

The caller being, say, a specialization of the following class:

  public abstract class MyCallingClass
  {
    public void Print()
    {
      Console.WriteLine("I am the object which is just about to call " +
        "the monitored operation named 'JustReturnTrue' …");
    }
  }

Then all you have to do is implement the following generic extension method for the types you want (in my example, “MyCallingClass”):

  public static class MyExtensionMethod
  {
    public static bool ExecuteOperation<T>
      (this T mycallingClass, int myParameter)
      where T: MyCallingClass
    {
      mycallingClass.Print(); // Replace it with your own stuff.
 
      return new MyCalledClass().JustReturnTrue(myParameter);
    }
  }

The trick here is to design and split assemblies and namespaces in a way that all instances of “MyCallingClass” cannot directly execute the “JustReturnTrue” operation (I leave that task as an exercise to the reader).
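Putting the pieces together, a minimal usage sketch could look like the following (the concrete subclass `MyConcreteCaller` and the `Demo` entry point are mine, added just to exercise the snippets above):

```csharp
using System;

public sealed class MyCalledClass
{
    internal bool JustReturnTrue(int myParameter) { return true; }
}

public abstract class MyCallingClass
{
    public void Print()
    {
        Console.WriteLine("I am the object which is just about to call " +
            "the monitored operation named 'JustReturnTrue' …");
    }
}

// Hypothetical concrete caller, since MyCallingClass is abstract.
public sealed class MyConcreteCaller : MyCallingClass { }

public static class MyExtensionMethod
{
    // The "this T" parameter hands the extension method a reference to the
    // calling object; that reference is the whole point of the trick.
    public static bool ExecuteOperation<T>(this T myCallingClass, int myParameter)
        where T : MyCallingClass
    {
        myCallingClass.Print(); // We know exactly which object made the call.
        return new MyCalledClass().JustReturnTrue(myParameter);
    }
}

public static class Demo
{
    public static void Main()
    {
        // Reads like a normal instance call, yet routes through the extension.
        bool result = new MyConcreteCaller().ExecuteOperation(42);
        Console.WriteLine(result); // True
    }
}
```

Note how the call site stays clean: the caller never passes itself explicitly, yet the extension method still receives it.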

There is one catch to watch closely, though. By doing this you are actually adding one more call (you knew that), which is generally not a problem on Windows; but on the XBox 360 and all devices using the Compact Framework, it could turn out to be expensive if used lots of times in intensive or heavy tasks/loops.

But if you really need it -when speed is not an issue, or for operations where, for example, you need to set owners to something “automagically” behind the scenes and later assess whether an owner calls a restricted operation before executing it- then there you have it!

Just use it wisely …
~Pete

> Link to Spanish version.

Sunday, July 26, 2009

HOW TO FIX LOCAL ISSUES TO CONNECT YOUR XBOX 360 TO THE XBOX LIVE SERVERS

A few weeks ago I had to reset my router because Windows Vista was giving me problems with my desktop’s wired LAN & WAN connections (note: problem fixed when I upgraded to Windows 7).

The thing is that when I re-configured my router using the file with the last-saved settings, my XBox had problems when attempting to connect to XBox Live’s channels (both arcade and indie). In fact, from time to time I received messages saying my connection was lost due to errors like 80072741 (as you will see in a moment, I just forgot to persist to disk the proper configuration values of my router. Sigh!).

When that happens, and assuming that the XBox Live Team isn’t working on the servers (testing, updating, etc.), then something wrong must be happening on your side (like in my case).

So, what could be possibly wrong?

  1. Your router is broken,
  2. Your router has a faulty/corrupt firmware,
  3. You’re connecting your XBox 360 using a faulty wire,
  4. Your 360 cannot retrieve a local IP address from the router,
  5. Your router’s firewall is preventing your 360 from connecting to the Internet,
  6. Your router is performing some strict or moderate Network Address Translation tasks (NAT),
  7. You get some weird error messages and/or lose connection when playing some games online in multiplayer mode, and
  8. Other connectivity problems.

If one or more of these happened to you, then maybe the following tips could help to solve the issues with your connection. Meaning? No solution guaranteed.

SO USE THESE TIPS AT YOUR OWN RISK!

Now that you were warned, read on carefully …

Your router is broken.

Buy a new one in case it’s not easy or worth repairing. In the meantime you can try to connect directly through your (DSL) modem.

Your router has a faulty/corrupt firmware.

Go to the manufacturer’s support page, download the latest firmware for your router’s model, and update it (first, read your router’s manual to find out how to do the update).

You’re connecting your XBox 360 using a faulty wire.

Just change the latter and try again.

But what if I’m using a wireless connection? Then check your router’s wireless settings, like, say, the security method and password.

Your XBox 360 cannot retrieve a local IP address from the router.

First check your network settings on your XBox 360: whether you want to get a dynamic or static IP address, the values for primary and secondary “Domain Name System” (DNS) addresses, the IP address of the gateway, etc.

Now, check on the router the maximum number of connections allowed at the same time. Maybe you are already using all of them.

If the router is currently providing a “Dynamic Host Configuration Protocol” (DHCP) service, then any computer, console and/or LAN/WAN device may be configured to attempt to get an IP address from the router dynamically.

This should work fine with your XBox 360 console for supported routers, but in case it doesn’t, just configure the console to get a static IP address.

Which one? Well, simply put, an IP address that you know other systems won’t normally use (for instance, if you have two computers plus your console and you are allowing, say, 8 connections, then set the last one as the static one for your console and you’ll probably do just fine).

In order to set a static IP address, on the 360’s Dashboard browse to “System –> Network Settings –> Edit Settings”, and then enter:

  • The static IP address for the console,
  • The Subnet mask (same than the one set in the router),
  • The Gateway IP address,
  • The Primary and Secondary DNS addresses.

Try the connection again, and if everything goes well, your 360 now should have access to the LAN.

Your router’s firewall is preventing your XBox 360 from connecting to the Internet.

Having access to the local network doesn’t mean that the router has also granted access to the Internet. Sometimes, the firewall of your router stops any attempt by your device and/or a set of IP addresses to reach the Internet.

If that is the case, then check all the security rules set on your router. Most routers allow you to specify each IP address you want to grant Internet access, a range of IP addresses, or even your devices’ “Media Access Control” (MAC) addresses.

Since the number of devices that connect to the Internet at my home is low, I just specify each MAC address and presto!

It may also happen that Internet access is only available on certain days and at certain hours. So you should check those rules, too.

Your router is performing some strict or moderate NAT tasks.

This is one of the most popular issues when an XBox 360 console attempts to establish a secure connection with the XBox Live’s servers.

In short, not all ports and protocols needed to establish an optimal communication are (properly) set.

Ok … uhh … what?

Do the following: test your 360’s connection to the LAN, the Internet and finally Live’s servers. If two out of three work fine, with the latter being the one that “partially” fails, then your router’s NAT functionality is not “Open”.

In fact, if that is the case, you can connect to Xbox Live, but the connection is not optimal when you want to play, chat, talk, or even accept a friend’s invite online.

In order to configure the ports and protocols needed to establish a “sound” connection to the XBox Live services, you can either:

  • Place your console in the “demilitarized zone” (DMZ), or
  • Manually configure the specific values using “Port forwarding”.

Note: in order to do one of these, you must first set a static IP address on the console.

DMZ means, in short, that you open all ports and protocols for communication with a certain device at a specified IP address. So your device is placed inside an unsecured zone or, if you prefer, an unrestricted area. DMZ is too risky!

The alternative (the one I prefer): manually set only the pairs of ports & protocols actually needed for the connection of a certain device with a static IP address (in this case, your 360’s IP address).

For the latter option, all you have to do is select the “Port Forwarding” tab in your router’s configuration and set something similar to:

  • Application name (say, “XBox360Live”),
  • Each port range (“from 80 to 80”, and so on),
  • The accepted protocols (UDP/TCP/Both),
  • The local IP address (that is, your 360’s static IP address value), and
  • Check “Enable” (or whatever option you need to activate the rule).

Read this article in order to know which ones you must set.
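As a reference, these are the ports that Microsoft’s support documentation has typically listed for Xbox Live (not taken from this post, so double-check them against an up-to-date source before relying on them):

```
Port 53    UDP and TCP   (DNS)
Port 80    TCP           (HTTP)
Port 88    UDP           (Kerberos authentication)
Port 3074  UDP and TCP   (Xbox Live game traffic)
```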

As I said at the beginning of this post, the above-mentioned issue was the one preventing my 360 console from connecting properly to Xbox Live. In short, the configuration file I had saved long ago as a backup didn’t include these settings. Fixed!

Now, continuing with the topic …

Optional: some recommend (I don’t) that when you receive the error code 8007274c, unchecking an option similar to “block anonymous Internet requests” in your router’s firewall may help. In some cases, clearing your console’s cache (warning: doing so will also erase all software updates, so you will have to load them again) and/or verifying that the proper DNS values are set may also help.

By the way, a given port can only be forwarded to one device at a time, which means that if two applications on the LAN attempt to get access to, say, the Internet using the same port, a conflict occurs, and if your router cannot resolve the situation, connectivity gets affected … maybe your 360 is one of the devices in conflict!

If your router’s firewall has a log, check it to see which device and application are the source of the conflict. You can also check the logs of your devices’ respective firewalls –if any.

If you cannot identify what’s causing the port conflict, I guess that turning off all devices but the 360 console should fix this connectivity issue, so you can play some games online again.

You get some weird error messages and or lose connection when playing some games online on multiplayer mode.

If you do have a valid Live Gold Membership, then this is somewhat related to the port issue.

Some games need a few ports open for certain protocols (TCP/UDP/Both), which differ from the ones listed here.

Again, if you don’t want to set (the IP address of) your console on DMZ, then you should manually set both port and protocol for that game.

OK, how can I know which pair I should set? Well, you can either do an Internet search to find out or visit sites like http://www.portforward.com/, which has a lot of info in this respect for a list of routers and services (including the 360’s connections).

Other connectivity problems … plus fixes?

For example, your router must support a minimum MTU (“Maximum Transmission Unit”); in the case of Xbox Live, that is 1364. Or your ISP’s DSL modem is not good enough (request a replacement).

It would be great to hear both the problem you experienced when connecting your 360 to XBox Live and, of course, the fix.

Well, this is it. I hope you find this info useful.

Enjoy your games!
~Pete

> Link to Spanish version.

Friday, July 17, 2009

XBOX LIVE DASHBOARD UPDATE PREVIEW

Many websites are reporting details about the upcoming “Autumn Update” for the XBox 360’s Dashboard.

There are really great features being added in this one.

For a full list of the new features, you can read this article.

Cannot wait!
~Pete

> Link to Spanish version.

Sunday, July 05, 2009

“XNAVATARS” - PART 2 – NO SHADOWS?

As promised, I’m hereby posting the second part of the series about using Avatars with XNA GS on the XBox 360.

If you remember, on my first article I showed how to draw an animated avatar on screen taking into account transitions between two animations.

In this part, I will talk about one factor that will help you improve a little bit the eye-candy in your game when using avatars: shadows.

As you may know, the Avatar’s rendering system does not allow us to use a custom shader effect to render an avatar on screen; instead, we must only use the built-in system to accomplish that task.

On one side, this simplifies things, but on the other, it limits our possibilities a little bit.

Casting shadows is an example of this limitation. As I will show you in a minute or two, a “cheap” workaround can be used for simple games.

The technique I’ll use is known as “Flat Shadows” (the XNA Framework includes all we need for it). It is a rather basic substitute for “real” shadows, but it will do the trick just fine for projects that don’t require “picky” shadow effects.
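Before diving into code, here’s a sketch of the math behind flat shadows (my own aside, not from the original sample): each vertex is slid along the light direction until it touches the reference plane, which is essentially what the XNA method Matrix.CreateShadow encodes as a single 4×4 matrix.

```latex
% Plane: \mathbf{n} \cdot \mathbf{x} + d = 0, with light direction \mathbf{L}.
% Each vertex \mathbf{v} is projected along \mathbf{L} onto the plane:
\mathbf{v}' \;=\; \mathbf{v} \;-\; \frac{\mathbf{n}\cdot\mathbf{v} + d}{\mathbf{n}\cdot\mathbf{L}}\,\mathbf{L}
% This mapping is affine in \mathbf{v}, so it can be written as a 4x4 matrix
% and simply concatenated with the avatar's World transform.
```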

We will be using the project I had included last time as a starting point and mainly focus on the code to add or modify.

1. Fields to add:

private Matrix[] transitionTransforms, shadowTransforms;
...
private Plane plane;
private float lightRotation;
private Model floor;

What’s new? The ‘shadow transforms’ array will store the matrices we need to flatten the model based on a reference plane, which we also define.

2. The constructor:

...
// Create the array of matrices that will hold bone transforms for shadows.
this.shadowTransforms = new Matrix[ 71 ]; // an avatar skeleton has 71 bones
...

Nothing fancy here. Just create the array that will hold the collection of matrices to flatten the model.

3. Initializing the game:

/// <summary>
/// Allows the game to perform any initialization it needs to before starting to run.
/// This is where it can query for any required services and load any non-graphic
/// related content.  Calling base.Initialize will enumerate through any components
/// and initialize them as well.
/// </summary>
protected override void Initialize()
{
    this.plane = new Plane( Vector3.Up, 0 );
 
    // As usual, initialize all components.
    base.Initialize();
}

We just create the reference plane with a normal facing up and no displacement along that normal (so it’s an XZ plane where the Y coordinate is zero, initially).

4. Loading content:

...
// Set the "World" value with a rotation of 180º.
this.avatarRenderer.World = Matrix.CreateTranslation(
    Vector3.Right * -1 + Vector3.Up * 0 + Vector3.Forward * -1 ) *
    Matrix.CreateRotationY( MathHelper.Pi );
...

We just modify the line that places the avatar in the world.
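A quick aside on the transform order (my own check, not part of the original text): XNA uses row vectors, so in the product above the translation is applied first and the 180º rotation about the world Y axis second. Since Forward = (0, 0, −1), the translation places the avatar at (−1, 0, 1), and the rotation then carries it to the opposite quadrant:

```latex
% Rotation by \pi about the Y axis maps (x, y, z) \to (-x, y, -z).
% Translation: (-1)\,\text{Right} + 0\,\text{Up} + (-1)\,\text{Forward} = (-1,\, 0,\, 1)
(-1,\, 0,\, 1) \;\xrightarrow{\;R_y(\pi)\;}\; (1,\, 0,\, -1)
```

So the avatar ends up at (1, 0, −1), turned around to face the opposite direction.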

5. Updating the game:

We will only add this line:

...
// Update the value used to rotate the light.
this.lightRotation += .5f * (float)gameTime.ElapsedGameTime.TotalSeconds;
...

So the light will rotate to show the effect.

6. Drawing the avatar:

/// <summary>
/// This is called when the game should draw itself.
/// </summary>
/// <param name="gameTime">Provides a snapshot of timing values.</param>
protected override void Draw( GameTime gameTime )
{
    // As usual, clear the backbuffer (or the current render target).
    GraphicsDevice.Clear( Color.CornflowerBlue );
 
    // Create the array of bone transforms for the floor and populate it.
    ModelBone[] transforms = new ModelBone[ this.floor.Bones.Count ];
    this.floor.Bones.CopyTo( transforms, 0 );
 
    // For each mesh in the floor model.
    foreach(var mesh in this.floor.Meshes)
    {
        // Get the basic effect.
        foreach ( BasicEffect effect in mesh.Effects )
        {
            // Set values and commit changes.
            effect.DiffuseColor = Color.LightSteelBlue.ToVector3();
            effect.View = this.avatarRenderer.View;
            effect.Projection = this.avatarRenderer.Projection;
            effect.World = transforms[ mesh.ParentBone.Index ].Transform;
            effect.CommitChanges();
        }
 
        // Finally, draw the mesh.
        mesh.Draw();
    }
 
    // Can we draw the avatar?
    if ( avatarRenderer != null && currentAnimation != null )
    {
        // If we can, is the animation in transition?
        if ( this.isInTransition )
        {
            // If so, draw it with the interpolated transforms.
            this.avatarRenderer.Draw(
                this.transitionTransforms,
                currentAnimation.Expression );
        }
        else
        {
            // If not, draw it with the actual transforms.
            this.avatarRenderer.Draw(
                this.currentAnimation.BoneTransforms,
                currentAnimation.Expression );
        }
 
        // Make the light sources of the avatar dark.
        Vector3 ambientColor = this.avatarRenderer.AmbientLightColor;
        Vector3 lightColor = this.avatarRenderer.LightColor;
        this.avatarRenderer.AmbientLightColor =
            this.avatarRenderer.LightColor =
                -10 * Vector3.One;
 
        // Enable alpha blending.
        GraphicsDevice.RenderState.AlphaBlendEnable = true;
        GraphicsDevice.RenderState.SourceBlend = Blend.SourceAlpha;
        GraphicsDevice.RenderState.DestinationBlend = Blend.InverseSourceAlpha;
 
        // Change the depth bias just a bit to avoid z-fighting.
        float sourceDepthBias = GraphicsDevice.RenderState.DepthBias;
        GraphicsDevice.RenderState.DepthBias = -0.0001f;
 
        // Set the new light direction.
        this.avatarRenderer.LightDirection = Vector3.Normalize(
            Vector3.Right * 7.5f * (float)Math.Cos( lightRotation ) +
            Vector3.Forward * 15.0f * (float)Math.Sin( lightRotation ) +
            Vector3.Up * 10.0f );
 
        // If the avatar is stepping over the floor, then move the plane 
        // according to the "altitude" of the avatar in the world so as
        // to calculate and cast shadows in the correct world position
        // (also, take into account that in case of a "jump" movement, in a 
        // "complete" shadow system you must reposition the shadow along the 
        // floor taking into account the place where the light-ray hits the 
        // floor while it points to the avatar; otherwise, it will stand still 
        // as if the avatar never jumped in the first place).
        this.plane.D = -this.avatarRenderer.World.Translation.Y;
 
        // Calculate and set the world transform that will flatten the 
        // avatar's geometry, taking into account the original rotation,
        // scale and translation factors.
        Matrix world = this.avatarRenderer.World;
        this.avatarRenderer.World *= Matrix.CreateShadow(
               this.avatarRenderer.LightDirection,
               this.plane );
 
        // Is the animation in transition?
        if ( this.isInTransition )
        {
            // If so, draw it with the interpolated transforms.
            this.avatarRenderer.Draw(
                this.transitionTransforms,
                currentAnimation.Expression );
        }
        else
        {
            // If not, draw it with the actual transforms.
            this.avatarRenderer.Draw(
                this.currentAnimation.BoneTransforms,
                currentAnimation.Expression );
        }
 
        // Reset all affected values.
        this.avatarRenderer.World = world;
        this.avatarRenderer.AmbientLightColor = ambientColor;
        this.avatarRenderer.LightColor = lightColor;
        GraphicsDevice.RenderState.DepthBias = sourceDepthBias;
        GraphicsDevice.RenderState.AlphaBlendEnable = false;
    }
 
    // The following is used to show some statistics and other info
    // on screen. It can be omitted (or optimized).
    this.spriteBatch.Begin();
 
    // No need for further explanation.
    this.spriteBatch.DrawString(
        this.font,
        "Press 'A' to force changing animations or 'Back' to exit.",
        new Vector2( 50, 25 ),
        Color.White );
 
    // No need for further explanation.
    this.spriteBatch.DrawString(
        this.font,
        "Press 'B' to change the type of selection : " +
        ( this.moveRandomly ? "RANDOMLY" : "IN ASCENDING ORDER" )
        + ".",
        new Vector2( 50, 55 ),
        Color.White );
 
    // Draw the animation pointer, whether we are processing a transition and
    // the current transition time. Please notice that in this implementation
    // when the current animation is about to end (that is, 1 second or less),
    // the pointer "currentAnimationId" will change even if the animation is still
    // the same, so you will see a different number and name during 1 second or so.
    this.spriteBatch.DrawString(
        this.font,
        this.currentAnimationId + " : " +
            ( (AvatarAnimationPreset)this.currentAnimationId ).ToString() +
            " (" +
            ( !this.isInTransition ? "no transition" : this.transitionProgress.ToString() + " processed" ) +
            ").",
        new Vector2( 50, 85 ),
        Color.White );
 
    // Draw the current position and length of the animation being rendered.
    if ( currentAnimation != null )
    {
        this.spriteBatch.DrawString(
            this.font,
            "Processed " +
            this.currentAnimation.CurrentPosition.ToString() +
                " of " +
                this.currentAnimation.Length.ToString() +
                ".",
            new Vector2( 50, 115 ),
            Color.White );
    }
 
    // Flush the batch.
    this.spriteBatch.End();
 
    // As usual, call the base method.
    base.Draw( gameTime );
}

Here’s where most changes occur; the game ...:

  1. ... draws the avatar as on my previous example,
  2. ... changes the values of the lights that affect the avatar,
  3. ... adjusts the depth-bias to avoid any eventual z-fight,
  4. ... rotates the light and adjusts the plane altitude before flattening the model,
  5. ... flattens the model’s geometry using the Matrix struct’s static “CreateShadow” method,
  6. ... draws the fake shadow, and finally ...
  7. ... restores the values of the lights and position of the avatar’s model meshes.

7. Changes to the transitions code: none.

If everything goes fine you will see something like this:

Things to notice, though:

  1. Shadows will be drawn even when there are no elements to cast shadows on (see how part of the shadow is rendered beyond the “floor” for a short period of time),
  2. You will have to modify the location of the shadow when the position of the avatar changes its height (i.e.: if jumping),
  3. The model’s meshes are flattened onto a reference plane, so it will only work for objects on that specific plane (like, in my example, a floor), and
  4. Thus, there’s no self-shadowing.

A more precise approach would be to extend this example using the stencil buffer and depth data, as explained at the end of this thread.

Well, this is it for today. You can find the source code for this example here.

Cheers!
~Pete

> Link to Spanish version.