Over the last few months I’ve been involved in an infrastructure project. The idea is to offer on-demand resources; think Jenkins, GitLab or any render queue. In my case, users work from different countries and time zones. This is one of the cases where building a web-based front end makes sense.
The challenge: I’ve never built anything mid-sized on the web, only micro solutions that needed close to zero maintenance and were extremely short-lived. To make things more interesting, the backend offered its services through gRPC.

A note for other tool programmers

This is a piece about my second project using React. The first one, even if functional, was a total mess. I’m not suggesting the approach described here makes sense for everyone, but it has worked for me and I think keeping it documented has value.

The main issue with web stuff for me is the amount of thingies you need to juggle to build a solution. To name just a few, this project contains: JavaScript, React, Babel, JSX, gRPC, Docker, Python, CSS, Redux and nginx. It’s surprisingly easy to drown in all that stack.

Starting: react-admin + tooling

I needed an IDE for JavaScript and I didn’t want to consume any license from the web team, so I started with Visual Studio Code. Coming from a bloated Visual Studio Professional, the difference in speed and responsiveness is remarkable. Adding JavaScript support was also quite simple using a Code plugin. Below it, I had a common npm + node installation. For heavier environments, JetBrains’ WebStorm IDE is what the professionals around me use most frequently.

From that point a simple:

npm install -g create-react-app
create-react-app my-lovely-stuff
cd my-lovely-stuff
npm install react-admin

will get you started. You can see a demo of react-admin from the marmelab team on their site.

With all that in place, how to start? After checking with more experienced full-time web devs, I was pointed at react-admin (RA from now on) as a starting point. Only later did I realize how much RA’s architecture would shape the rest of the solution. But as a starting point it is great. The documentation is really good; I learnt a lot from it. From the get-go you’ll have a framework where it’s easy to get (a minimal sketch follows the list):

  1. List, show-detail, edit and delete flows
  2. Pagination
  3. Filtering of results
  4. Actions on multiple selected resources
  5. Related resources and references, aka “this object references that other thing”, which makes navigation between resources simple
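
To make that concrete, here is a minimal sketch of a react-admin entry point, in TypeScript (TSX). The resource name, fields and provider are hypothetical placeholders, not the real project’s:

import * as React from 'react';
import { Admin, Resource, List, Datagrid, TextField } from 'react-admin';

// Stand-in provider; the data provider discussion comes later in this post.
const myDataProvider: any = {};

// A list view: react-admin wires in pagination, sorting and selection.
const JobList = (props: any) => (
  <List {...props}>
    <Datagrid>
      <TextField source="name" />
      <TextField source="status" />
    </Datagrid>
  </List>
);

// One <Resource> per kind of thing; routes and CRUD flows come almost for free.
const App = () => (
  <Admin dataProvider={myDataProvider}>
    <Resource name="jobs" list={JobList} />
  </Admin>
);

export default App;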

Halfway through development I found out about React hooks. I strongly suggest watching the introductory video; it was well worth the time I put into it.

I used only a fraction of the potential hooks offer, and that was more than enough. The resulting code is leaner and more expressive. If I ever need to write another web app using React I’ll try to squeeze more out of them.
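
As a taste of why the code gets leaner: a whole stateful class component collapses into a couple of hook calls. A tiny sketch (the endpoint is made up):

import * as React from 'react';
import { useEffect, useState } from 'react';

const JobCounter = () => {
  // Local state, no class required.
  const [count, setCount] = useState<number | null>(null);

  // Side effect: fetch once on mount (the empty array means "no dependencies").
  useEffect(() => {
    fetch('/api/jobs/count') // hypothetical endpoint
      .then((response) => response.json())
      .then((body) => setCount(body.count));
  }, []);

  return <span>{count === null ? 'loading…' : `${count} jobs`}</span>;
};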

RA is based on a large number of third-party libraries. For me the two most important are:

  1. React-Redux: I use it mainly in forms and to control side effects. Some of the forms I have in place are quite dense and interdependent.
  2. Material-UI: controls, layout and styles. From what I’m seeing around lately, it has become an industry standard. Out of the box it’s going to give you a Google-y look and feel.

Unless you’re planning to become a full-time web developer, I don’t believe it’s particularly useful to dig too deep into those two monsters of libraries. But a shallow knowledge of what each one is for can be quite useful (a small react-redux sketch follows).
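
For reference, the slice of react-redux I actually needed fits in a few lines: read a piece of state, dispatch an action. A sketch assuming the hooks API and a hypothetical store shape:

import * as React from 'react';
import { useSelector, useDispatch } from 'react-redux';

const FarmToggle = () => {
  // Read one slice of the store (the state shape here is made up).
  const enabled = useSelector((state: any) => state.farm.enabled);
  const dispatch = useDispatch();

  // Dispatch a plain action; a reducer elsewhere flips the flag.
  return (
    <button onClick={() => dispatch({ type: 'FARM_TOGGLED' })}>
      {enabled ? 'Disable farm' : 'Enable farm'}
    </button>
  );
};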

gRPC in the browser: Envoy + Docker

The backend was serving its data through a gRPC endpoint and was being built at the same time I was working on the frontend. One of the main concepts of gRPC is the .proto file contract: it defines the API surface and the messages that will travel through it. Google et al. have released libraries to consume gRPC services (based on that .proto specification) in many different programming languages, including JavaScript, .NET Core and Python.

But the trick here is that you can’t directly connect to a gRPC backend from the browser. In the documentation, Envoy is used to bridge the two. In other scenarios it’s possible to use Ambassador, if your infrastructure supports it.
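
From the browser’s side, every call goes to Envoy, which translates it into proper gRPC. As a sketch, this is roughly what the client code looks like with grpc-web; the service, message and port are hypothetical, and the client and message classes are generated from the .proto file:

import { JobServiceClient } from './generated/JobServiceClientPb'; // generated
import { ListJobsRequest } from './generated/jobs_pb';             // generated

// The URL points at the Envoy listener, not at the gRPC server itself.
const client = new JobServiceClient('http://localhost:8080');

const request = new ListJobsRequest();
request.setPageSize(25);

client.listJobs(request, {}, (err, reply) => {
  if (err) {
    console.error('call failed:', err.message);
    return;
  }
  console.log('jobs:', reply.getJobsList());
});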

Since the backend was under construction, I decided to write a little mock based on the .proto file using Python. Starting from the .proto file, I return the messages populated with fake but not random data. The messages are built dynamically through reflection from the grpc-python toolset output. The only manual work needed is to write the RPC entry points, which are automatically forwarded and answered by the mock.
Once the fake server is written you still need to make it reachable from the web browser. This is where docker-compose made my life way simpler: I wrote a compose file connecting Envoy and my server, and I had a reliable source of sample data to develop the UI against. In this case I was lucky, since my office computer runs the Pro version of Windows 10, which makes Hyper-V available, and the Docker toolset for Windows has improved a lot lately.
It’s perfectly possible to achieve similar results using non-Pro versions of Windows, or even more simply by using a Linux or Mac desktop.

This small solution turned out to be quite important down the line, given the amount of iteration the backend went through. In the web world there are many great API / backend mocking solutions based on REST calls. But when you’re working with gRPC the ecosystem is not as rich (or at least I didn’t find anything mature at that moment).

Other lessons

One of the interesting side effects of using RA is the impact of the dataProvider abstraction. The whole architecture orbits around the classic HTTP verbs. In the end, most of my code, beyond some specific layout work and extra forms, was pure glue. I have full translation layers in place: from gRPC into JavaScript objects and vice versa.

In my domain, and due to API restrictions, I was getting different categories of resources through the same gRPC entry points. After thinking about it a bit, the simplest solution I found was to implement pre-filtered data providers and give them resource-relevant names. In other words, I ended up with a collection of data providers that internally point at the same gRPCs but carry meaningful names (a sketch follows). This allowed me to offer meaningful routes while keeping the UI code isolated from the backend design.
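
A sketch of the idea in TypeScript, simplified down to getList and with hypothetical names: one factory closes over a category, every provider calls the same RPC, and the UI only ever sees well-named resources:

// Stand-in for a generated grpc-web client with a single listAssets RPC.
const backend = {
  listAssets: async (req: { category: string; page: number; perPage: number }) =>
    ({ items: [] as object[], total: 0 }),
};

type ListParams = { pagination: { page: number; perPage: number } };

// One factory, many pre-filtered providers: each one pins a category, so
// "characters" and "props" look like different resources to react-admin
// while the backend only ever sees the same entry point.
const makeProvider = (category: string) => ({
  getList: async (_resource: string, { pagination }: ListParams) => {
    const reply = await backend.listAssets({
      category,
      page: pagination.page,
      perPage: pagination.perPage,
    });
    return { data: reply.items, total: reply.total };
  },
});

const charactersProvider = makeProvider('character');
const propsProvider = makeProvider('prop');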

Containers, Docker in my case, are becoming more and more important as I go forward. If you know nothing about them, I strongly suggest you put some time into them. They can be a game changer, even if your only intent is to keep your dev environment as clean as humanly possible.


DICE’s summer party

Following a well-established tradition, DICE celebrated the arrival of summer by organizing a great party. They rented a great venue, the House under the bridge, built under a tall highway bridge over the water, with a nice and informal atmosphere.

This party reminded me of the ones arranged by EA Madrid’s team. Colleagues formed bands and performed for everyone. It was good fun, including arts and crafts. I had a really nice time.

Meeting old friends

It’s been a busy summer visit-wise. We reunited with old colleagues and went all around town, covering the mandatory sights and some uncommon corners here and there.

And, on top of everything, we had the chance to hang out with this German hunk. Lovely dude.

Something I never thought I’d do was visiting Skansen during Midsommar. I particularly enjoy traditional Swedish songs. And yes, we danced like little green frogs.

Improving life a little

Over the last months we’ve put a lot of effort into improving our apartment. We renovated the place and started buying new and hopefully better appliances. Let me introduce you to our new vacuum cleaner!

It’s a vacuum cleaner.

A lovely bagless machine that’s able to deal with cat hair and looks a little bit like an Autobot. While we were looking for models to buy we decided to check YouTube for suggestions. We discovered that there’s a ring of Scandinavian YouTubers who compare models and do all sorts of field tests on these machines. It was fascinating. I never thought anyone could get so excited about cleaning carpets.

And one last thing. We went to a live recording of No Such Thing as a Fish.

It’s a comedy podcast about trivia and curious facts; it’s funny and I recommend it quite often.

Cooking, expanded

During our time in Poland I discovered vindaloo and truly liked how violent that dish can be. But when reading about it in more detail I discovered that it’s not supposed to be poison; it’s supposed to be vinegary. So I tried my hand at cooking it:

Some lessons learnt: careful with the veggies or you’ll end up with a soup. A tasty one, but that’s not how the dish is supposed to go. Also, sweet tamarind is not the same as cooking tamarind. It was my first time trying this; it’ll get better next time.

Some weeks ago we were lucky enough to get invited to a nice Spanish get together. Since we’re that fancy we brought some cinnamon buns and some traditional pickled herring.

The recipe couldn’t be simpler, even if you start from raw fish. I discovered later that not everyone loves pickled herring; in fact, it’s almost like no one does. If you look closely you’ll see the cinnamon rolls just before baking.

Also, I decided to buy a crockpot for my parents. Quite a normal one, but it seems it’s a hit these days. It makes their days simpler.

And one last thing! A big grocery store opened very close to our place. It seems their plan is to specialize in imported foods and they have a Polish section. We were missing the Polish goodies so much.

If there’s one pending recipe I want to master, it’s Bigos. A Polish dietary nuclear bomb. In other words: it’s phenomenal. Don’t get intimidated by the different meats you need for it; just follow Chef John’s advice:

That happens to be one of the best YT cooking channels I know about.


My intention is to write a more technical entry … soon.


It’s been a handful of pretty busy months. For starters, we’ve returned to Sweden. Back in the motherland. Here you can see us mingling with the locals at the faithful Corner. In any case, nothing will ever eclipse the glory of the Sports Bar back in Warsaw.

Our timing was perfect and I rejoined DICE during the final dev months of Battlefield V. It’s a gorgeous game. I’m truly looking forward to trying it with some peers back home.

Swedish things

Due to a strange planetary alignment we had a number of super-traditional Swedish events. I went to my first crayfish party, including silly hats and duck faces.

And a couple of weeks later we attended a wedding. The venue was on the shore of a beautiful lake and we had a terrific time.

Everybody had a blast and we danced to a couple of ABBA songs too many. The Swedes have it in them.

New adventures in cooking

During our time in Warsaw I grew fond of YT’s cooking shows. And thanks to Mr. Sexy-Lips Adamo I discovered “Hot Ones”, a pretty entertaining show that awakened my interest in spicy foods and sauces.

While I was walking around the Old Town I found a little British food store with an extremely promising collection of mean-spirited sauces:

… needless to say, we’ve stocked up quite heavily.

One of my biggest recent cooking discoveries is that you can bake an omelette, so I’ve been experimenting with this approach a little bit. For instance, here’s a pic of the super-meaty minced meat + bacon version:

The lady has a collection of Swedish cookbooks dating back to the mid-70s. One of them is solid gold: Recipes from Swedish old days,

where I found a lacquered goose recipe that I truly want to try. I might give it a go if the lovely dudes from CDP decide to pay a visit to the far North.


During the last 16-18 months I’ve been working primarily with Microsoft technologies on the desktop. A big lump of WPF + OpenXML + Entity Framework. In other words: big stacks, massive code bases and tons of hours trying to understand what is going on under every:

using( var context = new DbContext() ) {
    var stuff = await context.Thangs.Where( w => w.Foobar < 3 ).ToListAsync();
    ...
}

.. block in my code.

I felt a little bit saturated. I wanted a project on the side, something interactive. And that’s how I found Godot, an open-source game engine and an all-in-one package.

Getting engine + tooling

This game engine was born around 2007 and has been in development since then. The project got an MIT license at the beginning of 2014. The mainline today is on version 3.0.5 and yes, there are versions for Mac + Linux. And just to make things even simpler, you can fetch a precompiled Godot from Steam. It doesn’t get simpler than that.

It’s also possible to build the engine, which includes the tooling, from source, even though it’s not the simplest distribution system I’ve seen. The “Compiling” documentation includes several step-by-step guides that worked well for me.

If you’re working under Windows you’ll notice that the size of the .exe is around 20 MB. That’s all; it includes both the environment and the runtime. The editor, opened, looks like this:

If you’re interested in testing the game in the image, you can try playing it in a browser.

As usual, if you’re planning on releasing on different targets, like iOS or Android, you’ll need the SDKs and the size may vary. At the moment there’s no official support for consoles.

Learning Godot engine

An interesting way of approaching this technology is to check some projects. Luckily, a game jam was hosted on itch.io quite recently, the Godot temperature game jam, and the submitted projects are interesting to play and check. It’s possible to download the sources and build the games yourself; most of the titles I checked host their sources on GitHub.

Godot’s architecture and code base make it well suited for teaching and for getting started in gamedev. It’s possible to develop new behaviors using the internal language, GDScript.

It’s also relatively simple to find YT playlists covering the basics of the engine; one example, from Game From Scratch’s YT channel, is the Godot 3 Tutorial Series.

I know there are a number of online courses, in the shape of Patreons + online unis, etc. But I don’t know enough about those to have a clear opinion.

Meanwhile, in the world

And now for something completely different: while I was deep inside one of Microsoft’s tech stacks, people have been busy and we have nice new toys:

Blender is looking better than ever and it’s approaching 2.8 at the whopping speed of a second per second. Perhaps this video could help you catch up:

.. fantastic work.

Cyberpunk 2077 has a new trailer after years of silence. There’s quite a lot to write about CD Projekt, timing, marketing, and whatnot.

.. but for now, it’s enough to say that I might have had some part in the behind-closed-doors demo at E3 2018.

Battlefield V seems to be, somehow, advancing in time, and the team travelled from WWI into WWI + 1. Or, in a trailer:

.. which, as usual, looks spectacular.


During the last weeks I’ve been asked to write some documentation for the localization tech stack I’ve been working on for the last 18-ish months. In the team I’m working with nowadays there’s a group of specialized documentation writers. Tech writers.

And when you check the docs they create, it’s clear they’re professionals. Unified styles, neutral English, linked documents, different sorts of media including images, gifs, videos, links to code, examples in the game… everything you can imagine. It looks costly, and it is.

And that works well for teams of some size. Let’s say sizes over one person. I’ve been driving absolutely every aspect of the stack by myself: DBs / Caching / Services / UI / Exchange formats. On two very different projects at the same time. Starting from scratch. It’s been a blast. But it’s a messy blast.

How it should look, for me

When consuming documentation I want two sources of information:

  1. As a final user of the stack: what does the user see? How does the UI work? What are the metaphors deployed?
  2. A high-level architectural view of the code base: server-based? Service-based? Local user only?

… and, once the intent is clear and the shared language with the user base is defined, then, if possible, show me some unit cases. Nothing fancy or spectacular, just something to start tweaking here and there.

That would be the gold standard.

Then, obviously, it’s better when the code is not rotten. But that’s a daily fight, and a different discussion.

So what’s next?

Umh, after the E3 mayhem, maybe I’ll be able to convince some producer to redirect a couple of peers from QA to work with me for a couple of weeks, and together we’ll go through all the insane nooks and crannies that one-man operations tend to generate at this scale. If I’m lucky, this person will be able to create some end-user documentation and we’ll discover some easy points for improvement.

Meanwhile, obviously, I have even more stuff to develop, including a nasty data migration related to a deep change in our domain.

Oh, the good ol’ times when I believed that running Doxygen and fleeing was enough.


I was worried about the performance of our database servers. Our access patterns are mostly read-only, so why not cache the data we need in an intermediate server? Redis appears to be a good solution.

Too many readers, few writers

From a data life cycle point of view, my current domain has the following characteristics:

  1. It evolves in big chunks and the number of users allowed to make changes to it is very limited.
  2. There are hundreds of concurrent users in read mode.
  3. It’s not mission critical for the consumers of the data to be perfectly up to date. They can wait some minutes.
  4. My budget is close to nil.

I didn’t want to route the readers of the data to the main DBs; that’d create the perfect bottleneck. So I’ve been looking into caching all that information, in memory, for a couple of weeks.

There are quite a lot of solutions out there. Microsoft has a couple: Velocity and AppFabric Cache. In the Linux world there are way more options, but at the beginning I was lazy and silly and wanted a full Windows stack.

First approach: memcached

Memcached is one of the veteran solutions in this endeavor. It’s incredibly stable and Facebook (among many others) has been maintaining it for quite a long time. Here you have a talk by the man himself.

It’s pretty rare to have scaling problems that compare to FB’s, so I decided to take a look. There are at least two major versions of this solution precompiled for Win32. They work, but everybody agrees the performance is not the same.

VirtualBox + Debian 9

Once I admitted that I should host my services on Linux, I went for one of the virtualization solutions I know, VirtualBox, and I noticed with glee that it was possible to install Debian. That was my first Linux distro. It feels a bit like returning home.

Then it was a matter of apt-getting make, gcc, vim, terminator, etc.

And I went to fetch the Memcached sources. But on my way there I thought that, since I had a “full fledged” Linux, why not check around a little. And then Redis happened.
On paper, Redis’ features are a superset of Memcached’s, so I decided to give it a go. With a Linux in place, it was a painless four steps to build from source.

Then you end up with something like this:

The base sources include the DB tooling, which is super nice.

C# + Redis: a lot of “Stacks”

Since I wanted a fast start on all this Redis biz, I checked PluralSight. That, in hindsight, was a bit of a mistake: Redis has a great amount of material on YouTube; they even have a conference.

My first approach was to write something in C# to feed a Redis DB. Following the advice from the PS course I opted for ServiceStack.Redis, and it works very well, except for one detail: my budget for all this is exactly zero dollars, and ServiceStack is clear regarding its pricing. Needless to say, I reached the starter limits in exactly one hour.
All that was, clearly, my bad. I should’ve read the terms of service better.

Thankfully there is a good list of other C# clients and I decided to take a look at StackExchange.Redis. Yup, I know, the names are super confusing. But that’s what happened. Combine that with some fever and you have a glorious headache just waiting for you.

The code itself is reasonably clear, in a somewhat “unit test” format:

const string redisConnectionString = "YourServerIP:6379,allowAdmin=true";
ConnectionMultiplexer cm = ConnectionMultiplexer.Connect( redisConnectionString );
IDatabase db = cm.GetDatabase();
Assert.IsNotNull( db );
string value = "abdcdfge";
db.StringSet( "myKey", value );
string recovered = db.StringGet( "myKey" );
Assert.IsTrue( value.Equals( recovered ) );

With this library in place, projecting my data into a Redis-friendly format is just a matter of wiggling LINQ enough.

Consuming the cache from C++

Unfortunately, the vast majority of the consumers of my domain work on C++ stacks. So there was the problem of finding a library that could communicate with the database with the minimum number of dependencies. I believe I have a good candidate here: cpp_redis.

It was painless to compile and try, but it’s still too soon to have a fully formed opinion about it. I might post something more down the line.

Some lessons learned

  1. First and foremost you should check your tweets three times before clicking “send”. I wrote “StackExchange” when I wanted to say “ServiceStack” and all hell broke loose.
  2. Redis is part of the NoSQL family of DBs. No schema enforced. That gives you a lot of opportunities. But it puts a lot of pressure on the main keys.
  3. This DB supports several data types as primitives: sets, lists, hashes… the natural candidate for persisting objects seems to be the hash, but I need to dig deeper into all this.
  4. The subscribe commands are incredibly powerful. Just for those alone, Redis is worth your time.
  5. This DB supports Lua on server side. And who doesn’t love Lua, right?


It’s that time of the year again. And, at the same time, it’s not. I’m not a young fella anymore, and for the last 40 years I’ve spent my Christmases in my hometown, visiting my friends and family.

But, … not this time.

The old land

As it should be painfully obvious by now, I’m Spanish. Old country. Long traditions. A land that predates the Roman Empire (in a very literal sense, as discussed in this blog). I’ve spent the last months in Poland, arguably the heart of Europe. A land on the border between the West and the East.

But yesterday something spectacular happened. I received a message, a simple note from the front desk: I had a package. A single innocuous figure: 11. That was the code.

A simple package from Correos, the Spanish postal service. And their main product: the green package, which ironically is not green and usually arrives pretty battered, having lost its parallelepiped condition. And, on it… the name and calligraphy of my parents.

Care package

And, in such boxy ruin, I found “jamón serrano”, smoked cheese, mussels … and, a-la Faulkner, I’m back there. Stranded. Lost. Marooned.

– Take care dudes and dudettes, try to have a nice time, you won’t have another.


Fair warning: this article is very technical and I’m not an expert on language analysis; I’m just a programmer with a problem to fix.

Extending strings in Localization

Think about the last time you played an online game. A competitive FPS, for example. The match ends, there’s a winner and the game displays:

Player xXxKillATonxXx wins the match

How can the game developers know in advance that Mr. xXxKillATonxXx was going to play? That’s either a string concatenation or a string substitution, and sometimes game devs opt for the second choice. This means that in the source of the game we’ll have something like this (let’s not get into the discussion of whether this is a good solution):

ID_WINNING_GAME = "Player {playerName} wins the match"

See it? That’s hell for QA. If you have, let’s say, 18 different text languages to localize, you need to be sure that those curly braces match, that the variable name “playerName” is correctly spelled in every language, and so on and so forth. That’s a reasonably easy problem to solve using regexes, but what happens when the UI team goes really wild and allows something like this:

ID_WINNING_GAME = "<red>Player</red> {playerName} <blue>wins <italic>the</italic> match</blue>"

… well, in that case you don’t have a simple string anymore, you have a DSL, which is a way more complex problem to solve. And, from the QA perspective, it’s more difficult to track.

So at this point we have a combination of tags and variables that can be nested indefinitely, in a process that’s incredibly error-prone and very difficult to check by eyeballing strings. It’ll also end in broken strings on screen at runtime, and that’s a risk for certification.

And don’t forget that, due to grammar, different languages might have the tags in different places and maybe in different orders. The only rule is that any localized version should have the same tags and structure (in terms of tag nesting) as the source language.

Parsing DSLs, enter Pidgin

Facing that problem I had two alternatives: either hand-write a recursive parser that chews the strings and tokenizes them properly, or use a more formal approach, in this case through Pidgin. The documentation of this library is pretty good, and the test and sample folders contain a plethora of good small snippets that you can use right away.

So, let’s dig into this problem a little bit. For simplicity, I’m going to reduce the scope to single format strings that can be nested as much as we want. Let’s begin with the basics and consume innocent strings:

Parser<char, string> Fluff = Token(c => c != '<' && c != '>').ManyString();

Simple enough, right? A call to Parse with that parser will consume anything that doesn’t contain < or >, and the result will be flagged as a success. On top of that, Fluff also accepts empty strings.

We can make our lives a little bit simpler by adding a bunch of simple parsers:

Parser<char, string> LT = String("<");
Parser<char, string> GT = String(">");
Parser<char, string> Slash = String("/");
Parser<char, Unit> LTSlash = LT.Then(Whitespaces).Then(Slash).Then(Return(Unit.Value));

So we have the basics of the language right there: LTs, GTs, slashes… all the components. Let’s aim for something more complex, the tag identifier, where we impose that the first element has to be a letter, in glorious LINQ style:

Parser<char, string> Identifier = from first in Token(char.IsLetter) // "Token" makes this parser return the parsed content
                                  from rest in Token(char.IsLetterOrDigit).ManyString()
                                  select first + rest;

… we’re ready to consume a full string that starts with a format marker and ends with the closing of that same marker; something like this will do:

Parser<char, Tag> FormatTag = from opening in LT
                              from formatLabel in Identifier.Between( SkipWhitespaces )
                              from closing in GT
                              from body in Fluff // !!! Attention here
                              from closingTokenOpener in LTSlash
                              from closingLabel in Identifier.Between( SkipWhitespaces )
                              from closingTokenCloser in GT
                              where ( formatLabel == closingLabel ) // we ensure that we're closing the correct tag
                              select new Tag( formatLabel, body ); // Let's imagine that you have this defined

If we’re lucky and the string we need to parse is surrounded by a single format marker, this piece of code will take care of it and return a “Tag” object that we’ll be able to compare and consume later.

But that’s not quite what we want to solve; we should change that call to Fluff into something that can potentially consume more tags embedded in the string. Also, we need to take care of a string that starts and ends with normal text and happens to have a tag in the middle. Let’s do that now:

Parser<char, Tag> tagParser = from preFluff in Fluff
                              from items in Try( FormatTag )
                              from postFluff in Fluff
                              select items;

See that Try modifier? That’s what enables the parser to backtrack in case of failure; in essence you don’t “lose the input” and other rules can still run. Incredibly useful. But we still can’t consume several of these rules in a row; let’s fix that now:

Parser<char, IEnumerable<Tag>> stringParser =
    OneOf( Try( tagParser.AtLeastOnce() ),
           Try( Fluff.ThenReturn( null as IEnumerable<Tag> ) ) );

That needs some unpacking:

  1. OneOf accepts a sequence of parsers and tries them from left to right; the first one that consumes input is selected, otherwise the whole thing fails. In this case we’re trying to parse either a tag or simple, innocent text.
  2. AtLeastOnce executes the previous parser one or more times and accumulates the output into an enumerable container.
  3. ThenReturn lets you return whatever you want once a parser has completed successfully; in this case we need to change the output of Fluff from string to IEnumerable<Tag>. In the end, the goal is not to know what the string contains but to ensure that the structure stays the same across languages.

So, going back to our “FormatTag” parser, we need to tweak it a little:

Parser<char, Tag> FormatTag = from opening in LT
                              from formatLabel in Identifier.Between( SkipWhitespaces )
                              from closing in GT
                              from body in stringParser // <<<<<<<<
                              from closingTokenOpener in LTSlash
                              from closingLabel in Identifier.Between( SkipWhitespaces )
                              from closingTokenCloser in GT
                              where ( formatLabel == closingLabel )
                              select new Tag( formatLabel, body );

And there we have it: nested strings, embedded indefinitely, with memory as the only limiting factor in this solution.

This is, of course, an incomplete solution. But it covers the main points of the grammar in place: recursion and tag verification.

Some lessons

  1. Recursive grammars become incredibly complex to parse. Using TDD is a must.
  2. Chop, chop, chop your problem. Every parser should do the absolute minimum; combining cases is the shortest route to failure and headaches.
  3. Test for End(). Sometimes the strings are empty, or you want to check that you’ve consumed the whole input.
  4. OneOf + Try is a pattern on its own. The library might have something more compact but, with my current knowledge, I like to use it.

Not data driven, but flexible enough

One of my few regrets with this solution is that it’s not completely data-driven. Other two-step solutions would’ve been more flexible. Imagine a grammar description in an external file that’s compiled at runtime and ends up as an in-memory parser you can use as you please. That’d have been way cooler, but also more complex, at least with my current knowledge of these libraries and technologies.


We’ve been playing the new entry in the South Park games: The Fractured but Whole.

This is a game for fans of the show. If you aren’t up to date with it you’re going to miss a lot of references and gags. South Park isn’t for everybody, but I’ve loved it since episode 1. There’s something about the creativity of the show that grabs me.

The story moves from a fantasy setting into a superhero game. Cartman and the kids go full-blown Marvel Universe: sworn enemies, franchises, plans for series, movies, comics, crossovers, the return of the glorious Mysterion, and Jimmy as “The Flash”. Their mocking of Marvel’s production plans is on point and funny.
The first part of the game is narratively a little bland, but the second half is a rollercoaster of twists, gags, creatures with too many asses and super mutants. We laughed a lot playing it.
Maybe the fart jokes go a little stale after 12 hours of constant references to flatulence, winds and gases. Still worth it.

I played the PS4 version and it seems some of the dialogue lines are dropped: the mouths are animated, but there’s no sound. That was the only technical issue we encountered.

How does it compare to the first?

Well, first of all, it looks exactly the same as the previous one. It’s a South Park episode that lasts around 15 hours. Some of the loading times are a little long, and, believe it or not, I noticed some frame drops here and there. For a game that moves almost nothing on screen that’s particularly surprising.

The finishing acts of the first game, The Stick of Truth, felt like a chore. Once the party was formed you kept repeating the same moves and tactics. It got tiresome.

Mechanically, the new game is more tactical than the first part. The title abandons the Final Fantasy approach to combat and introduces a grid system. You can move your characters around and that makes a real difference. It’s a very simple game though, the combat is not the main selling point.

If you compare the game scripts, my opinion is that the first one is superior to this entry in the series. It has loads of memorable moments; The Fractured but Whole packs less punch.

The character creation has been modified too: now you can play as a girl. Not that anyone in town cares; it doesn’t matter if you create a heterosexual white boy, the rednecks will still go after scum like you.

In terms of duration, they’re very similar. And they’re packed with twists and classic South Park moments.

Want to see more?

If you’re interested, you can watch the initial minutes of the game, including character creation, from our stream last week (spoilers for the initial segment).


jc_bellido

My name is Carlos Bellido and I code for a living. I’m currently living in Stockholm, employed by DICE. It happens that I like to swim quite a lot, hence the title.


I build production tools and general code at DICE http://www.dice.se