# Create Custom Objects
Make sure you have everything you need before proceeding:
- You understand the concepts of Protobuf.
- You have completed the introductory CosmJS tutorial.
- Go and npm are installed.
- You have finished the checkers Go blockchain exercise. If not, you can follow the tutorial here, or just clone and check out the relevant branch that contains the final version.
With your checkers application ready for use, it is a good time to prepare client elements that eventually allow you to create a GUI and/or server-side scripts. Here, you will apply what you have learned about creating your own custom CosmJS interfaces.
Before you can get into working on your application directly, you need to make sure CosmJS understands your checkers module and knows how to interact with it. This generally means you need to create the Protobuf objects and clients in TypeScript and create extensions that facilitate the use of them.
# Compile Protobuf
You will have to create a `client` folder that will contain all these new elements. If you want to keep the Go parts of your checkers project separate from the TypeScript parts, you can use another repository for the client. To keep a link between the two repositories, add the client parts as a submodule to your Go parts:

Replace the path with your own repository. In effect, this creates a new `client` folder. This `client` folder makes it possible for you to easily update another repository with content generated out of your Go code.
Create a folder named `scripts` in your project root. This is where you will launch the Protobuf compilation:

In the `scripts` folder, or in your Docker image, install a compiler:
Now install your additional modules:
Create the folder structure to receive the compiled files:
Check what Cosmos SDK version you are using:
This may return:
Download the required files from your `.proto` files:
Now compile:
You should now have your TypeScript files.
In order to easily repeat these steps in the future, you can add them to your existing `Makefile` with slight modifications:
Whenever you want to re-compile them, run:
You have created the basic Protobuf objects that will assist you in communicating with the blockchain.
# Prepare integration
At this point, you have the generated files in your `client` folder. If you have made this `client` folder a Git submodule, then you can work directly in it and do not need to go back to the checkers Cosmos SDK:
Also, if you use Docker and did not go through the trouble of building the Docker image for the checkers Cosmos SDK, you can use the `node:18.7-slim` image.
Install the Protobuf.js package in your client project:
At a later stage, you will add checkers as an extension to Stargate, but you can define your checkers extension immediately. The `canPlay` query could make use of better types for player and position. Start by declaring them in `client/src/checkers/player.ts`:
Your checkers extension will need to use the CosmJS Stargate package. Install it:
Now you can declare the checkers extension in `src/modules/checkers/queries.ts`:
Do not forget a setup function, as this is expected by Stargate:
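A sketch of such a setup function, assuming your generated query service is exported as `QueryClientImpl` and exposes `StoredGame` and `CanPlayMove` methods with the field names shown (all of these are assumptions taken from typical ts-proto output, so align them with your own generated `query.ts`):

```typescript
// Still in src/modules/checkers/queries.ts.
import { createProtobufRpcClient, QueryClient } from "@cosmjs/stargate";
import { QueryClientImpl } from "../../types/generated/checkers/query";

export function setupCheckersExtension(base: QueryClient): CheckersExtension {
    const rpc = createProtobufRpcClient(base);
    const queryService = new QueryClientImpl(rpc);

    return {
        checkers: {
            getStoredGame: async (index: string) => {
                const response = await queryService.StoredGame({ index });
                return response.storedGame;
            },
            canPlayMove: async (gameIndex: string, player: Player, from: Pos, to: Pos) => {
                // Depending on your Protobuf definitions, the coordinates may be
                // Long values rather than plain numbers.
                const response = await queryService.CanPlayMove({
                    gameIndex,
                    player,
                    fromX: from.x,
                    fromY: from.y,
                    toX: to.x,
                    toY: to.y,
                });
                return response.possible;
            },
        },
    };
}
```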
You may have to add these imports by hand:
Now create your `CheckersStargateClient` in `src/checkers_stargateclient.ts`:
# Integration tests
It is already possible to run some integration tests against a running checkers blockchain.
# Preparation
Install the packages needed to run the tests:
Describe how to connect to the running blockchain in a `.env` file in your project root. This depends on where you will run the tests, not on where you run the blockchain:
Alternatively, use whichever address connects to the RPC port of the checkers blockchain.
This information will be picked up by the `dotenv` package. Now let TypeScript know about this in an `environment.d.ts` file:
Also add your `tsconfig.json` as you see fit:
Add the line that describes how the tests are run:
# First tests
Because the intention is to run these tests against a running chain, possibly a fresh one, they cannot expect too much, such as how many games have been created so far. Still, it is possible to at least test that the connection is made and queries pass through.
Create `test/integration/system-info.ts`:
And create one for stored games:
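In the same spirit, a stored-games test can only verify that the query round-trip works, without assuming how many games exist. The sketch below uses the hypothetical `getStoredGame` extension method shown earlier:

```typescript
// test/integration/stored-game.ts
// A sketch that only checks the query round-trip against a fresh chain.
import { expect } from "chai";
import { config } from "dotenv";
import _ from "../../environment";
import { CheckersStargateClient } from "../../src/checkers_stargateclient";

config();

describe("StoredGame", function () {
    let client: CheckersStargateClient;

    before("create client", async function () {
        client = await CheckersStargateClient.connect(process.env.RPC_URL);
    });

    it("can query a game index that is unlikely to exist", async function () {
        // Depending on your module, a missing game may come back as undefined
        // or as a "not found" error; either way the connection and codec work.
        try {
            const game = await client.checkersQueryClient!.checkers.getStoredGame("9999999");
            expect(game).to.be.undefined;
        } catch (error) {
            expect((error as Error).message).to.include("not found");
        }
    });
});
```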
Note the forced import of `import _ from "../../environment"`, which actively informs the compiler of the `string` type (as opposed to `string | undefined`) and avoids a compilation error.
# Prepare your checkers chain
There is more than one way to run a checkers blockchain. For instance:
- If you came here after going through the rest of the hands-on exercise, you know how to launch a running chain with Ignite.
- If you arrived here and are only focused on learning CosmJS, it is possible to abstract away the niceties of the running chain in a minimal package. For this, you need Docker and to create an image:
Get the `Dockerfile`:

Build the image:
If you have another preferred method, make sure to keep track of the required `RPC_URL` accordingly.
If you are curious about how this `Dockerfile-standalone` was created, head to the run in production section.
# Launch the tests
Launch your checkers chain. You can choose your preferred method, as long as it can be accessed at the `RPC_URL` you defined earlier. For the purposes of this exercise, you have the choice between three methods:
When using Docker, note:
- `--name checkers` either matches the name you wrote in `RPC_URL`, or can be passed as an environment variable to another container to override the value found in `.env`.
- `--network checkers-net`, which is reused shortly if you also run your `npm` tests in Docker. See the paragraph on Docker networks later in this section.
Now, if you run the tests in another shell:
This should return:
The only combination of running chain / running tests that will not work is if you run Ignite on your local computer and the tests in a container. For this edge case, you should put your host's IP address in `--env RPC_URL="http://YOUR-HOST-IP:26657"`.
# A note on Docker networks
You may not have used Docker up to this point. The following paragraphs acquaint you with a Docker user-defined bridge network.

If you plan on using Docker Compose at a later stage, having a first taste of such networks is beneficial. Docker Compose can be used to orchestrate and launch separate containers in order to mimic a production setup. In fact, in the production section of this hands-on exercise you do exactly that. If you think this could eventually be useful, you should go through this section. You may want to redo this section with Docker.
Earlier you ran the commands:
This produced the following results:
- A Docker network was created with the name `checkers-net`. If containers are started in this network, all ports are mutually accessible.
- Your container started in it with the resolvable name of `checkers`.
- With `-p 26657:26657`, port 26657 was forwarded to your host computer, on top of being already shared on the `checkers-net` network.
Then, for tests:
- When you ran the tests from your host computer, they accessed the checkers chain via the forwarded port 26657. Hence `RPC_URL="http://localhost:26657"`.
- When you ran the tests in a different container, they accessed the checkers chain within the `checkers-net` Docker network thanks to the `checkers` name resolution. Hence `RPC_URL="http://checkers:26657"`.
- In particular, the `-p 26657:26657` port forwarding was not necessary. You can confirm that by stopping your chain and starting it again, this time without `-p`.
Docker networks are explored further in the next section.
When you are done, if you started the chain in Docker you can stop the containers with:
To summarize, this section has explored:
- The need to prepare the elements that will eventually allow you to create a GUI and/or server-side scripts for your checkers application.
- How to create the necessary Protobuf objects and clients in TypeScript, and the extensions that facilitate the use of these clients, so that CosmJS can understand and interact with your checkers module.
- How to use Docker to define a network to launch separate containers that can communicate, for the purpose of integration testing.