Dispatching tasks to multiple Pharo VMs using SystemProcess

Doing research work in the robotics domain using Pharo as a prototyping and implementation tool (via PhaROS) is a whole new experience. It is quite impressive to see how quickly an implementation idea becomes a working prototype/solution in Pharo thanks to its productive development environment. Most of my robotic applications are critical tasks which require real-time performance, and some of them are heavily resource-demanding (CPU). Due to the single-process nature of Pharo, running these tasks on the same VM results in a performance bottleneck, which sometimes violates the real-time requirements of the application. A common solution to this problem is to dispatch these tasks to several native system processes to boost performance. Unfortunately, this feature is not supported in current Pharo. My goal in this case is to have something that allows me to:

  • maintain the use of Pharo's productive environment, so implementing the applications in another language is not an option
  • be able to dispatch tasks to different system processes (in this case, tasks running on different VMs using the same image), and have a simple inter-VM communication mechanism

And that is where SystemProcess comes into play.

PTerm: yet another Terminal Emulator for Pharo


I use the Unix terminal a lot in my work. When working with Pharo and ROS (PhaROS), regularly switching between Pharo and a native terminal application (for the ROS command line) is rather inconvenient. I've been thinking of using a terminal emulator application inside Pharo. Googling around, I found out that there is no such thing ready for production work on modern Pharo, except a prototype by Pavel Krivanek available at: https://github.com/pavel-krivanek/terminal. However, that code is messy, buggy, and not ready for production work. So I decided to take my time to work on it.

WVNC: a web-based protocol and API for accessing a VNC server from the browser via websocket

This post is deprecated; WVNC is now part of the AntOS eco-system. The easiest way to set up a web-based VNC client is to use the AntOS docker image presented in this post.

With AntOS and its applications, one can remotely access and edit server resources from the browser in a desktop-like manner. However, sometimes these web-based applications are not enough for specific tasks. For a long time, I've been thinking of a web-based API for controlling a remote desktop right from the browser (or from an AntOS application). The VNC protocol is a good starting point. After playing around with libvncserver/libvncclient, I came up with wvnc, a web-based protocol and API for accessing VNC servers using websocket.

Controlling a Turtlebot using PhaROS: goal planning and automatic docking

This is a demonstration of my current work on controlling a robot using ROS and PhaROS. For that task, I've developed a dedicated PhaROS package that defines:

  1. A base framework for ROS-based visualization such as the map, the robot model, the robot trajectory, etc.
  2. An event-driven API for robot control

Evaluation of grid maps produced by a SLAM algorithm

When evaluating the performance of a SLAM algorithm, quantifying the quality of the produced map is one of the most important criteria. Often, the produced map is compared with (1) a ground-truth map (which can easily be obtained in simulation) or (2) another existing map that is considered accurate (in the case of real-world experiments, where the ground truth is not always available).

Basically, grid maps are images, so image similarity metrics can be used in this case. In this post, we consider three different metrics: Mean Square Error (MSE), K-nearest-based Normalized Error (NE) and the Structural Similarity Index (SSIM).
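As a rough illustration (just a minimal sketch, not the evaluation code used in this post; the file names are placeholders), the MSE and SSIM between two aligned grid maps can be computed with numpy and scikit-image:

    import numpy as np
    from skimage.io import imread
    from skimage.metrics import structural_similarity

    # Load the two maps as grayscale images (placeholder file names),
    # assuming they are already aligned and have the same resolution
    produced = imread("slam_map.png", as_gray=True).astype(float)
    reference = imread("reference_map.png", as_gray=True).astype(float)

    # Mean Square Error: average squared per-cell difference (lower is better)
    mse = np.mean((produced - reference) ** 2)

    # Structural Similarity Index: value in [-1, 1] (higher is better)
    ssim = structural_similarity(produced, reference,
                                 data_range=reference.max() - reference.min())

    print("MSE  =", mse)
    print("SSIM =", ssim)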

AntOSDK tutorial: Developing a simple text editor

AntOS API in a nutshell

AntOS provides an abstract API for application development. The core API contains three main elements: the UI API, the VFS API and the VDB API, as shown in the following graph:

The UI API defines the basic UI elements such as Window, List, Tree, Dialogs, etc., and provides a generic interface for application and dialog UI development. UI design consists of two steps: (1) the first step is to lay out the UI elements using AntOS' scheme syntax (in XML format); (2) the second step is to handle user interaction using the CoffeeScript/JavaScript API.

AntOS v0.2.4-alpha release

I have had too much work in the last few months and didn't have time to deal with some known bugs of AntOS until now. This v0.2.4-alpha release focuses on improvements and bug fixes.

Project page: https://github.com/lxsang/antos
Demo: https://os.lxsang.me/
AntOSDK tutorial: https://blog.lxsang.me/r:id:20

Applications developed using AntOSDK can be found in this repository: https://github.com/lxsang/antosdk-apps. They can be used as example projects for AntOSDK.

Change logs

  • Improve API code
  • Fix the bug where the menu UI does not update/refresh. This bug affects some applications: MarketPlace, AntOSDK
  • Fix the fileview UI so that it updates automatically when its window title changes. This bug affects the following applications: NotePad, AntOSDK
  • Fix the settings bug in File and release a new version
  • Fix a bug in MarketPlace and release a new version
  • Fix a bug in NotePad and release a new version
  • Fix a bug in AntOSDK and release a new version

WARNING: Due to some recent attacks on my server, web terminal access is disabled for the user demo. Someone tried to run a TOR relay on my server using that user, so I decided to disable terminal access for the demo user. You can still log in, but you can't use the shell.

Linux Tips and Tricks

This post contains some tips and tricks that help resolve problems I've encountered when working with Linux, mostly Ubuntu.

Ubuntu: Install 32-bit libraries on a 64-bit system

Some of my applications are 32-bit only, and they sometimes depend on several 32-bit libraries. By default, Ubuntu installs only the 64-bit version of these libraries. To install the 32-bit ones, we need to enable the i386 architecture using dpkg. The following commands should be executed as root:
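    dpkg --add-architecture i386
    apt-get update
    apt-get install libexample:i386

(These are the standard dpkg/apt multi-arch commands; libexample is only a placeholder for the 32-bit library your application actually needs, e.g. libc6:i386.)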

Simple (naive) document clustering using tf-idf and k-means

When I developed this blog (using my own client-server platform: web server, back-end, front-end, etc., built from ash/scratch :) ), I simply designed it as a simple "notebook" where I put my ideas or some stuff that I have done. So, initially, there were no categories and no advanced features like post suggestions based on the current post; it was just a bunch of posts sorted by date. The thing is, I usually work on many different domains (robotics, IoT, back-end and front-end platform design, etc.), so my posts are mixed up between different categories. That is fine for me, but it is a real inconvenience for readers who want to follow the category they are interested in on this blog. Of course, I could redesign the blog and add the missing features by messing around with the relational database design (I'm using SQLite, btw), manually classifying the posts in the back-end, etc. But I'm a kind of lazy person, so I've been thinking of a more automatic solution. How about an automatic document clustering feature based on a data mining approach? Here we go!
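To give an idea of the approach (just a minimal sketch with a made-up toy corpus, not the implementation actually used on this blog), tf-idf vectorization followed by k-means clustering can be written in a few lines with scikit-learn:

    from sklearn.feature_extraction.text import TfidfVectorizer
    from sklearn.cluster import KMeans

    # Toy corpus: one string per blog post (made-up snippets)
    posts = [
        "evaluation of grid maps produced by a SLAM algorithm",
        "dispatching tasks to multiple Pharo VMs",
        "adding noise to odometry data published by Gazebo",
        "AntOS web desktop release notes",
    ]

    # Turn each post into a tf-idf weighted term vector,
    # ignoring common English stop words
    vectorizer = TfidfVectorizer(stop_words="english")
    X = vectorizer.fit_transform(posts)

    # Group the posts into k clusters (k is chosen manually here)
    model = KMeans(n_clusters=2, n_init=10, random_state=0)
    labels = model.fit_predict(X)

    for label, post in zip(labels, posts):
        print(label, "->", post)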

Adding noise to odometry data published by the Gazebo simulator

I've had a fun time playing around with the Gazebo simulator for autonomous robot exploration. One thing I've noticed is that the odometry data provided by Gazebo is so perfect that it sometimes makes the simulation less realistic. I used a Turtlebot model as the robot model in my simulations. Googling around, I didn't find any solution for adding noise to the odometry data of this robot (using the URDF file). I then decided to develop a dedicated ROS node allowing me to add some random noise to Gazebo's odometry data.
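Just to illustrate the idea (this is not the actual node: the node and topic names are made up, and the noise here is naively added per message rather than through the motion model discussed below), a minimal rospy sketch could look like this:

    import numpy as np
    import rospy
    from nav_msgs.msg import Odometry
    from tf.transformations import euler_from_quaternion, quaternion_from_euler

    def callback(msg):
        # Perturb the 2D position with zero-mean Gaussian noise
        msg.pose.pose.position.x += np.random.normal(0.0, 0.02)
        msg.pose.pose.position.y += np.random.normal(0.0, 0.02)
        # Perturb the yaw angle only, since the robot moves in the plane
        q = msg.pose.pose.orientation
        roll, pitch, yaw = euler_from_quaternion([q.x, q.y, q.z, q.w])
        yaw += np.random.normal(0.0, 0.01)
        q.x, q.y, q.z, q.w = quaternion_from_euler(roll, pitch, yaw)
        pub.publish(msg)

    if __name__ == "__main__":
        rospy.init_node("noisy_odom")
        pub = rospy.Publisher("/odom_noisy", Odometry, queue_size=10)
        rospy.Subscriber("/odom", Odometry, callback)
        rospy.spin()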

The robot motion model

First things first, we need to understand the robot motion model. There are many motion models, but in the scope of this article, we focus only on the odometry motion model. Odometry is often obtained by integrating sensor readings from wheel encoders; it measures the relative motion of the robot between times \(t-1\) and \(t\), i.e. over \((t-1,t]\). In a 2D environment, a robot pose is represented by a point \((x,y)\) and an orientation (rotation angle) \(\theta\), so the robot poses at times \(t-1\) and \(t\) are denoted by:
$$p_{t-1}=(x_{t-1},y_{t-1},\theta_{t-1})$$
$$p_{t}=(x_t,y_t,\theta_t)$$
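Following the standard odometry motion model (as described in Probabilistic Robotics), the relative motion between these two poses is usually decomposed into an initial rotation, a translation and a final rotation, and noise is then injected into each of these three components:
$$\delta_{rot1}=\operatorname{atan2}(y_t-y_{t-1},\,x_t-x_{t-1})-\theta_{t-1}$$
$$\delta_{trans}=\sqrt{(x_t-x_{t-1})^2+(y_t-y_{t-1})^2}$$
$$\delta_{rot2}=\theta_t-\theta_{t-1}-\delta_{rot1}$$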
