In [ ]:
import sys
sys.path.append('/opt/rhc')
and importing a couple of components.
In [ ]:
import rhc.micro as micro
import rhc.async as async
In [ ]:
def ping(request):
    return {'ping': 'pong'}
We can take a look at ping in order to know how to refer to it.
In [ ]:
ping
In [ ]:
p = micro.load_server([
    'SERVER useless 12345',
    'ROUTE /test/ping$',
    'GET __main__.ping',
])
load_server

The load_server helper function dynamically loads server definitions. In this case, the definition is contained in a list, but it could also be loaded from a file by specifying the file's name, or by specifying a dot-separated path to the file in the python code tree.
In a microservice implementation, the server definitions are included in the micro file, or in one of the imported files. This function is included for experimentation and development.
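As a sketch of the file-based form, the same kind of definition can be written to a file and loaded by name. The file name, server name, and port below are made up so they don't collide with the server started above:

# write a definition to a file (hypothetical name: useless.micro)
with open('useless.micro', 'w') as f:
    f.write('SERVER useless2 12346\n')
    f.write('ROUTE /test/ping$\n')
    f.write('GET __main__.ping\n')

# load the same directives from the file instead of a list
p_from_file = micro.load_server('useless.micro')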
The SERVER directive provides a name and a listening port for a service. The socket is started and listens for incoming connections. All by itself, a SERVER doesn't provide much.
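As a minimal sketch (with a made-up name and port), a definition containing only a SERVER directive opens a listening socket and nothing more:

# SERVER <name> <port> -- a listener with no routes attached
lonely = micro.load_server([
    'SERVER lonely 12347',
])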
The ROUTE directive watches for a specific HTTP resource on incoming connections. In this case, the resource is the exact string /test/ping. Even when combined with a SERVER, a ROUTE doesn't provide much.
The GET directive tells micro what REST handler to run if an HTTP GET occurs on the most recently defined ROUTE. In this case, we specify the ping function defined earlier. The handler is dynamically imported when the server is started. Other HTTP methods, PUT, POST, DELETE, can be used as directives as well.
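For instance, a sketch of a route that answers more than one method might look like this. The server name, port, resource, and save handler are made up; only the directive format comes from the description above:

def save(request):
    # hypothetical handler for POST requests
    return {'saved': True}

p2 = micro.load_server([
    'SERVER another 12348',   # made-up name and port
    'ROUTE /test/thing$',     # most recently defined ROUTE
    'GET __main__.ping',      # GET on /test/thing runs ping
    'POST __main__.save',     # POST on /test/thing runs save
])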
The useless server is now listening, but we need a way to connect to it. We start by defining a connection to the listening port:
In [ ]:
con = async.Connection('http://localhost:12345')
And then doing a GET on the /test/ping resource.
In [ ]:
async.wait(con.get('/test/ping'))
The async.wait function is pulling double duty here by running both the server code and the client code until the client code completes. Each network event causes the microservice (here running inside wait) to perform some action in response to the event. We'll look at each action in turn.
When con.get (aka the client) is executed, it starts a connection to localhost:12345 and waits. It doesn't explicitly wait for anything; it just stops processing. There is nothing to do until another network event occurs.
When the SERVER listening on port 12345 receives the connection attempt, it accepts the call and waits.
For the curious: The microservice periodically polls the socket listening on port 12345 to see if it is "readable". If it is readable, that means that another socket is trying to connect. When this happens, the microservice "accepts" the connection, creating a new socket which represents the microservice's side of the connection. TCP will make sure that the client side of the connection is notified that the connection is complete.
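This polling pattern is not specific to rhc; a standalone sketch of the same idea, using Python's socket and select modules directly, looks roughly like this:

import select
import socket

# a listening socket, as a stand-in for the one created by SERVER
listener = socket.socket()
listener.setsockopt(socket.SOL_SOCKET, socket.SO_REUSEADDR, 1)
listener.bind(('localhost', 0))   # any free port
listener.listen()
listener.setblocking(False)

# poll: a listening socket reported as "readable" has a pending connection
readable, _, _ = select.select([listener], [], [], 0.1)
if listener in readable:
    conn, addr = listener.accept()   # create the server-side socket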
When the client is connected, it sends a GET /test/ping to the server as an HTTP document and waits.
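On the wire, that request is just a small block of text. The exact headers rhc sends aren't shown here, but the document looks roughly like:

GET /test/ping HTTP/1.1
Host: localhost:12345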
When the server receives the entire HTTP document, it matches it to the ROUTE and GET directives, and calls __main__.ping, which immediately returns the dictionary {'ping': 'pong'}.
The server sends the dictionary as a json document in an HTTP response to the client and waits.
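The response going back is also just text; the exact headers are up to rhc, but it looks roughly like:

HTTP/1.1 200 OK
Content-Type: application/json
Content-Length: 16

{"ping": "pong"}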
For the curious: The microservice started polling the connected socket as soon as the connection was completed in the accept step above. When data arrives on the socket, the socket becomes "readable", which tells the microservice that it's possible to read some data. Data is read and parsed until an entire HTTP document is received.
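Again, the mechanism can be sketched with the standard library alone; this standalone example uses a socketpair as a stand-in for the client/server connection and reads until no more data is waiting:

import select
import socket

# a connected pair of sockets, standing in for the client and server sides
client_side, server_side = socket.socketpair()
client_side.sendall(b'GET /test/ping HTTP/1.1\r\n\r\n')

server_side.setblocking(False)
buffer = b''
while True:
    # a connected socket reported as "readable" has data waiting
    readable, _, _ = select.select([server_side], [], [], 0.1)
    if server_side not in readable:
        break                      # nothing more to read right now
    data = server_side.recv(4096)
    if not data:
        break                      # peer closed the connection
    buffer += data                 # parse until a full HTTP document arrives

print(buffer)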
When the client receives the entire HTTP document, it indicates to the wait function that it is done. The wait function prints the json document and stops.
In [ ]: