Improving Code Coverage for Loklak PHP API

Tests


This week, I added tests for the Suggest, Map, Markdown, Push and Susi APIs and fixed the tests for user-topology-related queries. PHPUnit was used as the testing framework since it provides code-coverage support.

The Loklak PHP API now has good test-suite support and associated examples to help you use our services.

Below are the tests that were added.

testPush – to test the Push API.

testSusi – to test the newly added Susi API.

Refer to this and this for more info about Susi.

testMap – to test the Map API.

testMarkdown – to test the Markdown API.

For more detailed information regarding the entire Loklak PHP API test-suite, refer to this.
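
The tests themselves appeared as screenshots in the original post. As a rough illustration only (the class name, query and assertions here are hypothetical, not the exact test-suite code), a PHPUnit test for the Susi endpoint could look like this:

require_once 'loklak.php';

class SusiTest extends PHPUnit_Framework_TestCase
{
	public function testSusi()
	{
		$loklak = new Loklak();
		$result = $loklak->susi('Hello');   // illustrative query

		// susi() returns the JSON-encoded HTTP response wrapper;
		// the actual Susi answer is the JSON string in its body
		$response = json_decode($result);
		$this->assertNotNull($response);

		$body = json_decode($response->body, true);
		$this->assertTrue(is_array($body));
	}
}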

In the process, code coverage increased from 33% to 61%. The test suite is continuously updated as new APIs are added to Loklak.

Source support for Search API


Apart from that, since Loklak is scaling beyond Twitter, a source argument has been added to the Search API to define the source of the search results. As far as the WordPress plugins are concerned, since they only require Twitter results for now, source has been added as a default argument, as sketched below.
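
The code itself appeared as a screenshot in the original post. A minimal sketch of the idea (not the exact loklak_php_api code), mirroring the Requests-based style of the PHP API functions shown later in this post, would be:

// Sketch only: a search method on the Loklak class with 'twitter' as the
// default source, so the WordPress plugins keep receiving Twitter results
public function search($query = null, $source = 'twitter') {
	$this->requestURL = $this->baseUrl . '/api/search.json';
	$params = array(
		'q'      => $query,
		'source' => $source
	);
	$request = Requests::request($this->requestURL, array('Accept' => 'application/json'), $params);
	return json_encode($request);
}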


Susi support for Loklak APIs

Here at Loklak, we are continuously striving to innovate. Continuing this trend, we recently launched ‘Susi – the chat bot’. Please refer to this previous blog post by Damini.

Along with the chat bot, Susi query support was added to the Loklak Python and PHP APIs. Susi can be queried from localhost as well as from other online Loklak peers.

The Susi API function was added to the Python API (as shown below). See the full implementation here.

def susi(self, query=None):
    """Hits Susi with the required query and returns back the susi response"""
    susi_application = 'api/susi.json'
    url_to_give = self.baseUrl + susi_application
    self.query = query
    if query:
        params = {'q': self.query}
        return_to_user = requests.get(url_to_give, params=params)
        if return_to_user.status_code == 200:
            return return_to_user.json()
        else:
            # Non-200 status code from the server
            return_to_user = {}
            return_to_user['error'] = 'Looks like there is a problem in susi replying.'
            return json.dumps(return_to_user)
    else:
        # No query was passed in
        return_to_user = {}
        return_to_user['error'] = 'Please ask susi something.'
        return json.dumps(return_to_user)

A sample usage of the Susi API in Python could be:

from loklak import Loklak
query = "Hi I am Zeus"
l = Loklak()
result = l.susi(query)
print(result)

Susi integration with the PHP API (see below). See the full implementation here.

public function susi($query=null) {
	$this->requestURL = $this->baseUrl . '/api/susi.json';
	$this->query = $query;
	if ($query) {
		$params = array('q' => $this->query);
		$request = Requests::request($this->requestURL, array('Accept' => 'application/json'), $params);
		if ($request->status_code == 200) {
			// Return the whole HTTP response as JSON; the Susi answer is in its body
			return json_encode($request);
		}
		else {
			// Non-200 status code from the server
			$request = array();
			$request['error'] = "Looks like Susi is not replying.";
			return json_encode($request);
		}
	}
	else {
		// No query was passed in
		$request = array();
		$request['error'] = "Please ask Susi something.";
		return json_encode($request);
	}
}

Sample usage of Susi API in PHP:

include('loklak.php');
$loklak = new Loklak(); 
$result = $loklak->susi('Hi I am Zeus');
$susiResponse = json_decode($result);
$susiResponse = $susiResponse->body;
$susiResponse = json_decode($susiResponse, true);
var_dump($susiResponse);

Tests for the above-mentioned functions have been added to the respective API test suites. Refer to this and this.

Try Social Universe Super Intelligence!

Ask questions, interact with it. I am pretty sure that you would like it!


Loklak getting the followers from Weibo

As with Twitter, Loklak has started to scrape Weibo data. Sina Weibo is a Chinese microblogging (weibo) website. Akin to a hybrid of Twitter and Facebook, it is one of the most popular sites in China, in use by well over 30% of Internet users, with a market penetration similar to Twitter’s in the United States.

I have started to scrape the user’s bio page which looks something like this.

[Screenshot: a user’s profile page on Weibo]

The above image is a user’s profile on Weibo. The profile has two frames: one is the user’s bio, which is similar to Facebook’s bio page; the second closely follows Twitter’s format. So I scraped the user’s follower details, which are presented as a table, and JSoup makes scraping tables easy.

[Screenshots: scraping the followers table from the profile page]

This is how the table on the profile page gets scraped so that the followers data can be obtained from Weibo. Stay tuned for more scraping updates.


Convert web pages into structured data

Loklak provides a new API which converts web pages into structured JSON data. The genericscraper API lets you scrape any web page from a given URL and returns the structured JSON. Just pass the URL in the given format: http://localhost:9000/api/genericscraper.json?url=http://www.google.com

This scrapes generic data from the given web page URL; for instance, this is the output for the main Google search page:

{
  "Text in Links": [
    "Images",
    "Maps",
    "Play",
    "YouTube",
    "News",
    "Gmail",
    "Drive",
    "More »",
    "Web History",
    "Settings",
    "Sign in",
    "Advanced search",
    "Language tools",
    "हिन्दी",
    "বাংলা",
    "తెలుగు",
    "मराठी",
    "தமிழ்",
    "ગુજરાતી",
    "ಕನ್ನಡ",
    "മലയാളം",
    "ਪੰਜਾਬੀ",
    "Advertising Programs",
    "Business Solutions",
    "+Google",
    "About Google",
    "Google.com",
    "Privacy",
    "Terms"
  ],
  "Image files": [],
  "source files": [],
  "Links": [
    "http://www.google.co.in/imghp?hl=en&tab=wi",
    "http://maps.google.co.in/maps?hl=en&tab=wl",
    "https://play.google.com/?hl=en&tab=w8",
    "http://www.youtube.com/?gl=IN&tab=w1",
    "http://news.google.co.in/nwshp?hl=en&tab=wn",
    "https://mail.google.com/mail/?tab=wm",
    "https://drive.google.com/?tab=wo",
    "https://www.google.co.in/intl/en/options/",
    "http://www.google.co.in/history/optout?hl=en",
    "/preferences?hl=en",
    "https://accounts.google.com/ServiceLogin?hl=en&passive=true&continue=http://www.google.co.in/%3Fgfe_rd%3Dcr%26ei%3DR_xpV6G9M-PA8gfis7rIDA",
    "/advanced_search?hl=en-IN&authuser=0",
    "/language_tools?hl=en-IN&authuser=0",
    "http://www.google.co.in/setprefs?sig=0_VODpnfQFFvCo-TLhn2_Kr9sRC2c%3D&hl=hi&source=homepage",
    "http://www.google.co.in/setprefs?sig=0_VODpnfQFFvCo-TLhn2_Kr9sRC2c%3D&hl=bn&source=homepage",
    "http://www.google.co.in/setprefs?sig=0_VODpnfQFFvCo-TLhn2_Kr9sRC2c%3D&hl=te&source=homepage",
    "http://www.google.co.in/setprefs?sig=0_VODpnfQFFvCo-TLhn2_Kr9sRC2c%3D&hl=mr&source=homepage",
    "http://www.google.co.in/setprefs?sig=0_VODpnfQFFvCo-TLhn2_Kr9sRC2c%3D&hl=ta&source=homepage",
    "http://www.google.co.in/setprefs?sig=0_VODpnfQFFvCo-TLhn2_Kr9sRC2c%3D&hl=gu&source=homepage",
    "http://www.google.co.in/setprefs?sig=0_VODpnfQFFvCo-TLhn2_Kr9sRC2c%3D&hl=kn&source=homepage",
    "http://www.google.co.in/setprefs?sig=0_VODpnfQFFvCo-TLhn2_Kr9sRC2c%3D&hl=ml&source=homepage",
    "http://www.google.co.in/setprefs?sig=0_VODpnfQFFvCo-TLhn2_Kr9sRC2c%3D&hl=pa&source=homepage",
    "/intl/en/ads/",
    "http://www.google.co.in/services/",
    "https://plus.google.com/104205742743787718296",
    "/intl/en/about.html",
    "http://www.google.co.in/setprefdomain?prefdom=US&sig=__SF4cV2qKAyiHu9OKv2V_rNxesko%3D",
    "/intl/en/policies/privacy/",
    "/intl/en/policies/terms/",
    "/images/branding/product/ico/googleg_lodp.ico"
  ],
  "language": "en-IN",
  "title": "Google",
  "Script Files": []
}
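
The endpoint returns plain JSON over HTTP, so it can be consumed from any language. As a rough sketch (assuming a Loklak server running locally on port 9000), fetching and printing a few of the fields from PHP might look like this:

// Illustrative only: query the genericscraper endpoint and print some fields
$target = urlencode('http://www.google.com');
$url = 'http://localhost:9000/api/genericscraper.json?url=' . $target;

$json = file_get_contents($url);
$data = json_decode($json, true);

echo $data['title'] . "\n";           // page title
foreach ($data['Links'] as $link) {   // every link found on the page
	echo $link . "\n";
}

The keys used here ("title", "Links") are taken from the sample output above.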

I wrote a generic scraper using JSoup, a popular Java library for HTML scraping. It scrapes generic fields like the title, images, links, source files and the text inside links. After the generic scraper was ready, I registered the API endpoint as api/genericscraper.json along with its servlet.


The servlet loads the page from the value of the url parameter and selects the relevant tags with JSoup’s getElementsByTag. After storing the elements, it loops through the list and retrieves the attributes from each tag, pushes the extracted data into JSONArrays, collects the arrays into a JSON object, and pretty-prints it.


There is an app called WebScraper, available on the Loklak apps page, that consumes the above API.



The apps page gets a makeover

Huge transformations took place in the apps section. It’s amazing to see the number of Loklak apps shoot up: there are more than 20 apps built using the various APIs provided by Loklak. In order to manage the apps dynamically, apps.json was introduced.


Initially there was a simple card layout, with just the name and the description of each app.


The new tile design shows a screenshot of each app, and the app’s details appear when the mouse hovers over the screenshot. The apps are categorized based on the type of API being used, and accordingly the left navigation bar lists all the categories.

The page is dynamic and takes its data from a new JSON object served by the apps API. It holds all the apps and their details, along with fields such as the list of categories and the corresponding apps under each category.
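
The JSON itself appeared as screenshots in the original post; an illustrative sketch of its shape (the category and app names below are made up, not the real apps.json content) would be something like:

{
  "categories": ["Search", "Visualization", "Misc"],
  "Search": {
    "apps": ["ExampleSearchApp", "AnotherSearchApp"]
  },
  "Visualization": {
    "apps": ["ExampleChartApp"]
  }
}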


The “categories” array is used to build the list of categories in the left navigation bar.


Each category object gives the list of apps under that category; it is used to display the apps’ details when a specific category is clicked.

This JSON made it easy to categorize the apps into sections and gave the page a new look. The tile design was built entirely with standard CSS classes, and the page is responsive and dynamic.


Loklak API SDK now supports Golang

The Go programming language is a fairly recent language. It is statically typed and compiled, unlike Python and other scripting languages; it offers greater memory safety than C, supports garbage collection, and has built-in support for making HTTP and network requests through its "net/http" and "net/url" packages. Go scales to very large systems like Java and C++. It also keeps things productive and readable because of its far smaller number of keywords.


One of the key things you notice in Golang, coming from a C/C++ background, is that there is nothing called a class. Wait, what? Exactly, you read it right: what other languages express as a class, Go expresses as a struct. The Loklak object used throughout the Go API is such a struct; its string fields (baseUrl, query, since, until, from_user, count, source) show up in the search function further below. The imports used across the implementation are as follows:

import (
    "encoding/json"
    "fmt"
    "net/http"
    "net/url"
    "os"

    "github.com/hokaccha/go-prettyjson"
)

In Golang it is recommended to list the built-in packages first, followed by the remote libraries you use; here that is the pretty-print JSON library at github.com/hokaccha/go-prettyjson. Since we follow the DRY (Don’t Repeat Yourself) principle, we write a shared helper function called getJSON as follows:

func getJSON(route string) (string, error) {
	r, err := http.Get(route)
	if err != nil {
		return "", err
	}
	defer r.Body.Close()

	var b interface{}
	if err := json.NewDecoder(r.Body).Decode(&b); err != nil {
		return "", err
	}
	out, err := prettyjson.Marshal(b)
	return string(out), err
}

If you are coming from a C/C++ background you will notice something unusual here: the return types are declared in the function signature itself. So func getJSON(route string) (string, error) takes a string in the variable route as input and returns two values, a string and an error. The code above takes a request URL and returns the corresponding JSON response.

Methods are generally not the preferred style for REST-based API development like Loklak’s, so most of the queries are made directly with functions. We do, however, start with one method, Connect(), which sets the Loklak server URL to use.

// Initiation of the loklak object
func (l *Loklak) Connect(urlString string) {
	u, err := url.Parse(urlString)
	if (err != nil) {
		fmt.Println(u)
		fatal(err)
	} else {
		l.baseUrl = urlString
	}
}

This takes a string urlString as a parameter and defines a method called Connect() on the Loklak object, which updates the object’s base URL field. It is used as follows in the main package:

loklakObject := new(Loklak)
loklakObject.Connect("http://loklak.org/")

The Go language has built-in facilities, as well as library support, for writing concurrent programs. Concurrency refers not only to CPU parallelism but also to asynchrony: letting slow operations like a database or network read run while the program does other work, as is common in event-based servers. This can be very useful for developers who consume Loklak data and want to build highly parallel applications on top of it.

Another interesting point is that Golang supports neither function overloading nor default parameters, so the Search API, which was implemented with default parameters in PHP and Python, cannot be implemented that way in Go. This has been tackled by using a search() function and prepackaging the request to be made to it as a Loklak object.

// Search function is implemented as a function and not as a method
// Package the parameters required in the loklak object and pass accordingly
func search (l *Loklak) (string) {
	apiQuery := l.baseUrl + "api/search.json"
	req, _ := http.NewRequest("GET",apiQuery, nil)

	q := req.URL.Query()
	
	// Query constructions
	if l.query != "" {
		constructString := l.query
		if l.since != "" {
			constructString += " since:"+l.since
		}
		if l.until != "" {
			constructString += " until:"+l.until
		}
		if l.from_user != "" {
			constructString += " from:"+l.from_user
		}
		fmt.Println(constructString)
		q.Add("q",constructString)
	}
	if l.count != "" {
		q.Add("count", l.count)
	}
	if l.source != "" {
		q.Add("source", l.source)
	}
	req.URL.RawQuery = q.Encode()
	queryURL := req.URL.String()
	out, err := getJSON(queryURL)
	if err != nil {
		fatal(err)
	}
	return out
}

To use this search capability, one creates a Loklak object, packages the request parameters into it, and then calls the search function:

func main() {
	loklakObject := new(Loklak)
	loklakObject.Connect("http://loklak.org/")
	loklakObject.query = "fossasia"
	loklakObject.since = "2016-05-12"
	loklakObject.until = "2016-06-02"
	loklakObject.count = "10"
	loklakObject.source = "cache"
	searchResponse := search(loklakObject)
	fmt.Println(searchResponse)
}

The Golang API has great potential for aiding the Loklak server and for applications written in Go that need highly scalable, high-performance access to Loklak. The API is available on GitHub; feel free to open an issue if you find a bug or need an enhancement.


Growing list of API libraries for loklak

We are very happy that the list of API libraries for loklak is constantly growing. Please check out the following project to create applications with loklak:
