Over the last few months I’ve been involved in an infrastructure project. The idea is to offer on-demand resources: think Jenkins, GitLab, or any render queue. In my case, users work from different countries and time zones. This is one of the cases where building a web-based front end makes sense.
The challenge: I’ve never built anything mid-sized for the web, only micro solutions that needed close to zero maintenance and were extremely short-lived. To make things more interesting, the backend offered its services through gRPC.
This is a piece about my second project using React. The first one, though functional, was a total mess. I’m not suggesting the approach described here makes sense for everyone, but it has worked for me and I think keeping it documented has value.
From that point, a simple:
npm install -g create-react-app
create-react-app my-admin
cd my-admin
npm install react-admin
will get you started. You can see a demo of react-admin from the marmelab team here:
With all that in place, how do you start? The more experienced full-time web devs I checked with recommended react-admin (RA from now on) as a starting point. Only later did I realize how much RA’s architecture would shape the rest of the solution, but as a starting point it is great. The documentation is really good; I learnt a lot from it. From the get-go you’ll have a framework that makes it easy to build:
- List, show-detail, edit, and delete flows
- Result filtering
- Actions on multiple selected resources
- Related resources and references (“this object references that other thing”), making navigation between resources simple
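Getting those flows is mostly a matter of declaring resources. A minimal entry point might look like the sketch below — the `jobs` resource name and the dataProvider module are placeholders for illustration, not the real project’s code:

```jsx
// App.js — minimal react-admin entry point (names are placeholders).
import React from "react";
import { Admin, Resource, ListGuesser } from "react-admin";
import dataProvider from "./dataProvider"; // hypothetical module

const App = () => (
  <Admin dataProvider={dataProvider}>
    {/* ListGuesser infers list columns from the data — handy while prototyping */}
    <Resource name="jobs" list={ListGuesser} />
  </Admin>
);

export default App;
```

Each `<Resource>` you declare gets its routes, list view, and CRUD plumbing for free; you then swap the guessers for real components as the UI firms up.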
Halfway through development I found out about React Hooks. I strongly suggest watching this video; it was well worth the time I put into it:
I used only a fraction of the potential Hooks offer, and that was more than enough. The resulting code is leaner and more expressive. If I need to write another web app using React, I’ll try to squeeze more out of them.
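The basic pattern looks like this: local state plus a side effect inside a plain function component, with no class boilerplate. A small sketch, assuming a made-up `fetchCount` prop:

```jsx
// Hooks in a nutshell: useState for local state, useEffect for side effects.
import React, { useState, useEffect } from "react";

const JobCounter = ({ fetchCount }) => {
  const [count, setCount] = useState(null); // null until the first fetch lands

  useEffect(() => {
    let cancelled = false;
    fetchCount().then((n) => {
      if (!cancelled) setCount(n);
    });
    return () => {
      cancelled = true; // cleanup: don't set state after unmount
    };
  }, [fetchCount]);

  return <span>{count === null ? "…" : `${count} jobs`}</span>;
};
```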
RA is built on top of a large number of third-party libraries. For me, the two most important are:
- React-Redux: I use it mainly in forms and to control side effects. Some of the forms I have in place are quite dense and interdependent.
- Material-UI: Controls, layout, and styles. From what I’m seeing around lately, it has become an industry standard. Out of the box it’s going to give you a Google-y look and feel.
Unless you’re planning to become a full-time web developer, I don’t believe it’s particularly useful to dig too deep into those two monsters of libraries. But a shallow knowledge of what each one is for can be quite useful.
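As an example of that shallow-but-useful level: a custom field in RA is often just a Material-UI control wrapped in a tiny component that receives the current record. The field and record names here are made up:

```jsx
import React from "react";
import Chip from "@material-ui/core/Chip";

// A react-admin custom field: RA injects the current row as the `record` prop,
// and Material-UI's Chip does the rendering.
const StatusField = ({ record }) => (
  <Chip label={record ? record.status : ""} />
);

export default StatusField;
```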
The trick here is that you can’t directly connect to a gRPC backend from the browser; a gRPC-Web client needs a proxy to translate its calls. In the grpc-web documentation, Envoy is used to bridge the two. In other scenarios it’s possible to use Ambassador instead, if your infrastructure supports it.
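The bridge itself is just an Envoy listener with the `grpc_web` HTTP filter in front of an HTTP/2 cluster. A heavily abbreviated sketch — ports, names, and exact filter fields depend on your Envoy version, and the grpc-web docs carry a complete `envoy.yaml`:

```yaml
# Abbreviated Envoy bridge: browser speaks gRPC-Web on :8080,
# Envoy speaks real gRPC (HTTP/2) to the backend.
static_resources:
  listeners:
    - name: grpc_web_listener
      address:
        socket_address: { address: 0.0.0.0, port_value: 8080 }
      filter_chains:
        - filters:
            - name: envoy.filters.network.http_connection_manager
              typed_config:
                "@type": type.googleapis.com/envoy.extensions.filters.network.http_connection_manager.v3.HttpConnectionManager
                stat_prefix: ingress_http
                route_config:
                  virtual_hosts:
                    - name: backend
                      domains: ["*"]
                      routes:
                        - match: { prefix: "/" }
                          route: { cluster: grpc_backend }
                http_filters:
                  - name: envoy.filters.http.grpc_web   # gRPC-Web <-> gRPC translation
                  - name: envoy.filters.http.cors
                  - name: envoy.filters.http.router
  clusters:
    - name: grpc_backend
      connect_timeout: 0.25s
      type: logical_dns
      http2_protocol_options: {}   # talk HTTP/2 to the real gRPC server
      load_assignment:
        cluster_name: grpc_backend
        endpoints:
          - lb_endpoints:
              - endpoint:
                  address:
                    socket_address: { address: mock-grpc, port_value: 50051 }
```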
Since the backend was under construction, I decided to write a little mock in Python based on the .proto file. Starting from the .proto file, I return the messages populated with fake but not random data. The messages are built dynamically through reflection over the grpc-python toolset’s output. The only manual work needed is to write the RPC entry points, which are then automatically forwarded and answered by the mock.
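The reflection idea can be illustrated with plain dataclasses standing in for the generated protobuf messages (the real mock walked the grpc-python output instead; the names and fake-value rules below are invented for the example). The key point is that the fake data is deterministic, derived from the field names, so the UI sees stable values across runs:

```python
import dataclasses

def fake_value(name, typ):
    """Deterministic, name-derived fake data: stable across runs, unlike random data."""
    if typ is int:
        return len(name)
    if typ is str:
        return f"fake-{name}"
    if typ is bool:
        return True
    return None

def fake_message(message_cls):
    """Build an instance of message_cls with every field filled via reflection."""
    kwargs = {
        f.name: fake_value(f.name, f.type)
        for f in dataclasses.fields(message_cls)
    }
    return message_cls(**kwargs)

@dataclasses.dataclass
class Job:  # stands in for a generated protobuf message
    id: int
    owner: str
    running: bool

print(fake_message(Job))  # Job(id=2, owner='fake-owner', running=True)
```

With generated protobuf classes the same loop runs over the message descriptor’s fields instead of `dataclasses.fields`, so new .proto revisions are picked up without touching the mock.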
Once the fake server is written, you still need to make it reachable from the web browser. This is where docker-compose made my life much simpler: I wrote a compose file with Envoy and my server connected, and I had a reliable source of sample data to develop the UI against. In this case I was lucky, since my office computer runs a Pro version of Windows 10, which makes Hyper-V available, and the Docker toolset for Windows has improved a lot lately.
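The compose file is short: the mock and Envoy on one network, with only Envoy’s port published to the host. Service names, paths, and the image tag below are examples, not the project’s actual files:

```yaml
# Sketch of the compose setup: browser -> envoy:8080 -> mock-grpc:50051
version: "3"
services:
  mock-grpc:
    build: ./mock        # the Python mock built from the .proto file
    expose:
      - "50051"          # only reachable inside the compose network
  envoy:
    image: envoyproxy/envoy:v1.14.1
    volumes:
      - ./envoy.yaml:/etc/envoy/envoy.yaml
    ports:
      - "8080:8080"      # what the browser's gRPC-Web client connects to
    depends_on:
      - mock-grpc
```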
It’s perfectly possible to achieve similar results on non-Pro versions of Windows, or even more simply on a Linux or Mac desktop.
This small solution turned out to be quite important down the line, given the amount of iteration the backend went through. In the web world there are many great API/backend mocking solutions based on REST calls, but with gRPC the ecosystem is not as rich (or at least I didn’t find anything mature at the time).
In my domain, and due to API restrictions, I was getting different categories of resources through the same gRPC endpoints. After thinking about it, the simplest solution I found was to implement pre-filtered data providers and give them resource-relevant names. In other words, I ended up with a collection of data providers that internally pointed at the same RPCs but carried meaningful names. This let me offer meaningful routes while keeping the UI code isolated from the backend design.
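The idea can be sketched as a wrapper that merges a fixed filter into every request. This is written against the object-style data provider of newer react-admin versions (the same trick works with the older function-style provider), and all names plus the `kind` filter field are invented:

```javascript
// Wrap a provider so every getList carries a fixed, resource-specific filter.
const withFixedFilter = (baseProvider, fixedFilter) => ({
  ...baseProvider,
  getList: (resource, params) =>
    baseProvider.getList(resource, {
      ...params,
      filter: { ...params.filter, ...fixedFilter },
    }),
});

// Stub standing in for the single gRPC-backed provider; it echoes the
// filter it received so the merging is visible.
const baseProvider = {
  getList: (resource, params) =>
    Promise.resolve({ data: [], total: 0, filterUsed: params.filter }),
};

// One underlying provider, several resource-relevant fronts:
const jobProvider = withFixedFilter(baseProvider, { kind: "job" });
const nodeProvider = withFixedFilter(baseProvider, { kind: "node" });
```

Each front then backs its own `<Resource>`, so the routes read naturally while every request still lands on the same RPCs.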
Containers (Docker in my case) are becoming more and more important as I move forward. If you know nothing about them, I strongly suggest putting some time into them; it can be a game changer, even if your intent is just to keep your dev environment as clean as humanly possible.