Oracle’s mysql.connector for python

Oracle released a pure-python mysql connector with connection pooling support.

Creating a connection pool is really easy: you can try the following snippets in ipython.

import mysql.connector

auth = {
    "database": "test",
    "user": "user",
    "password": "secret"
}

# the first call instantiates the pool
mypool = mysql.connector.connect(
    pool_name = "mypool",
    pool_size = 3,
    **auth)

All subsequent calls to connect(pool_name="mypool") will be served by the pool.

# this won't create another connection
#  but lend one from the pool
conn =  mysql.connector.connect(
    pool_name = "mypool",
    pool_size = 3,
    **auth)

# now get a cursor and play
c = conn.cursor()
c.execute("show databases")
print(c.fetchall())  # consume the result set before closing the cursor
c.close()

Closing the connection will just release the connection to the pool: we’re not closing the socket!

conn.close()
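
If you prefer to manage the pool explicitly, the same module also exposes a MySQLConnectionPool class. A minimal sketch of the equivalent setup:

from mysql.connector.pooling import MySQLConnectionPool

# build the pool explicitly instead of relying on the pool_name trick above
pool = MySQLConnectionPool(pool_name = "mypool", pool_size = 3, **auth)

conn = pool.get_connection()   # borrow a connection from the pool
conn.close()                   # release it back to the pool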

Generating svg from xml with python etree

Python rocks at managing xml files. Having to convert some vss shapes to odt, I just:

# vss2xhtml file.vss > file.xhtml

Then I parsed the xhtml containing multiple images with python

from xml.etree import ElementTree as ET

# XML declaration prepended to every extracted svg file
xml_header = """<?xml version="1.0" encoding="UTF-8" standalone="no"?>
"""
tree = ET.parse("file.xhtml")
# get all the svg images (ElementTree only supports the relative .// XPath form)
images = tree.findall('.//{http://www.w3.org/2000/svg}svg')

# enumerate avoids i=0, i+=1
for i, x in enumerate(images):
    destination_file = "image_%s.svg" % i
    with open(destination_file, 'w') as fd:
        fd.write(xml_header)
        fd.write(ET.tostring(x))
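
If the extracted files come out full of ns0: prefixes, registering the svg namespace as the default one before serializing keeps the output clean (an optional tweak, available since python 2.7):

# serialize <svg> instead of <ns0:svg>
ET.register_namespace('', 'http://www.w3.org/2000/svg')
ET.register_namespace('xlink', 'http://www.w3.org/1999/xlink')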

Statistics 101 with ipython

Today I needed to process some log files looking for relations between the data. After parsing the log file I got the following table.

data = [ 
('timestamp', 'elapsed', 'error', 'retry', 'size', 'hosts'),
(1379603191, 0.12, 2, 1, 123, 2313),
(1379603192, 12.43, 0, 1, 3223, 2303),
...
(1379609000, 0.43, 0, 1, 3223, 2303)
]
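
For reference, a minimal parsing sketch that could produce such a table, assuming a whitespace-separated log carrying the six fields in this order (both the format and the parse_log helper are hypothetical):

def parse_log(path):
    # hypothetical format: one whitespace-separated record per line
    rows = [('timestamp', 'elapsed', 'error', 'retry', 'size', 'hosts')]
    with open(path) as fd:
        for line in fd:
            ts, elapsed, error, retry, size, hosts = line.split()[:6]
            rows.append((int(ts), float(elapsed), int(error),
                         int(retry), int(size), int(hosts)))
    return rows

data = parse_log("my.log")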

I easily converted this into a columned dict:

table = dict(zip( data[0], zip(*data[1:]) ))
{
 'timestamp': (1379603191, 1379603192, ..., 1379609000),
 'elapsed': (0.12, 12.43, ..., 0.43),
 ...
}

In this way it was very easy to run basic stats:

# using numpy for mean/std, since there is no `stats` module in the python 2 stdlib
import numpy as np
print [ [k, max(v), min(v), np.mean(v), np.std(v)] for k, v in table.items() ]

Check the data distributions:

from matplotlib import pyplot
pyplot.hist(table['elapsed'])

And even look for basic correlation between columns:

from itertools import combinations
from scipy.stats import pearsonr
for f1, f2 in combinations(table.keys(), 2):
    r, p_value = pearsonr(table[f1], table[f2])
    print("the correlation between %s and %s is: %s" % (f1, f2, r))
    print("the p-value (the probability of seeing such a correlation if the columns were uncorrelated) is: %s" % p_value)

Or draw scatter plots

from matplotlib import pyplot
for f1, f2 in combinations(table.keys(), 2):
    pyplot.scatter(table[f1], table[f2], label="%s_%s" % (f1, f2))
    # add legend and other labels
    r, p = pearsonr(table[f1], table[f2])
    pyplot.title("Correlation: %s v %s, %s" % (f1, f2, r))
    pyplot.xlabel(f1)
    pyplot.ylabel(f2)
    pyplot.legend(loc='upper left') # show the legend in a suitable corner
    pyplot.savefig(f1 + "_" + f2 + ".png")
    pyplot.close()

bashing ipython

IPython is a wonderful tool that avoids continuously switching from bash to other utilities like bc, perl & co.

One of its limitations is I/O redirection, at which bash is really good. Since the IPython py-shell profile uses /bin/sh by default (through os.system), I implemented a quick and dirty replacement that diverts system calls to bash.

I added the following lines to .ipython/profile_pysh/ipython_config.py:

import os
import subprocess

def system2(cmd):
    """Run cmd through bash instead of the default /bin/sh."""
    pid = os.fork()
    if pid == 0:
        # child: replace the process with bash running the command
        args = ['/bin/bash', '-c', cmd]
        os.execvp("/bin/bash", args)
    else:
        # parent: wait for the child and return its exit status
        c_pid, status = os.waitpid(pid, 0)
        return status

print("Overriding os.system with bash")
os.system = system2

# or, more simply, use subprocess if available
os.system = lambda cmd: subprocess.call(cmd, shell=True, executable='/bin/bash')
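
A quick sanity check is to run a bash-only construct that /bin/sh would reject, for instance process substitution (the directories are just placeholders):

# works only once os.system points to bash: /bin/sh has no process substitution
os.system("diff <(ls /tmp) <(ls /var/tmp)")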

While the py-shell profile runs as a shell command every call that is not resolved in the python globals(), other profiles use the bang syntax, e.g. !ls -l.

To divert those commands to bash too, I just changed the following lines in
/usr/lib/python2.7/dist-packages/IPython/utils/_process_common.py (note the executable argument):

71     p = subprocess.Popen(cmd, shell=True,
72                          executable='/bin/bash',
73                          stdin=subprocess.PIPE,

python dsadmin module: 389 made easy

Rich Megginson, the 389 Keymaster, created this wonderful python library https://github.com/richm/scripts

To create a new root suffix, just:

credential = {
    'host': 'localhost',
    'port': 10389,
    'binddn': 'cn=directory manager',
    'bindpw': 'secret'
}

Connect to an already existing instance, passing our dict as keyword arguments
conn = DA(**credential)  # DA is the dsadmin connection class (DSAdmin)

Create a backend database
backend_name = conn.setupBackend('o=addressbook1')

Bind it to an entry
conn.setupSuffix(suffix='o=addressbook1', bename=backend_name)

And create the entry

e = Entry('o=addressbook1')
e.setValue('objectclass', ['top','organization'])
e.setValue('o','addressbook1')
conn.add(e)
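
To double check, you can read the entry back. A minimal sketch, assuming conn exposes the standard python-ldap search_s() (the dsadmin connection class extends python-ldap's connection object):

import ldap

# read back the new root suffix
entries = conn.search_s('o=addressbook1', ldap.SCOPE_BASE, '(objectclass=*)')
print(entries)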

Easy, isn’t it?

Json and Django: myth and music in the dojo

Json has already been presented by Fabio Fucci on this blog. Django is a python framework for web applications, which supports json through a library.

We’re going to create a simple request-response application:

  1. the request is issued by dojo.xhrPost;
  2. the response is managed by python.

Creating a request means creating a json string to send to the server.

/* create the variable to post */
var arguments = {'user': 'ioggstream', 'status': 'at work'};

dojo.xhrPost({
    url: '/json/post',
    handleAs: "json",
    /* serialize the argument object to a json string */
    postData: dojo.toJson(arguments),
    load: function(data) {
        alert(data.result);
    },
    error: function(error) {
        alert("An unexpected error occurred: " + error);
    }
});

Now that the request is issued and postData is a json string, we use python to deserialize the string into a python object.
The dict() python class, aka dictionary, is an associative array: the django.utils.simplejson module can serialize and deserialize objects using dict().

Let’s see the code

import logging

from django.utils import simplejson
from google.appengine.ext import webapp


class JsonService(webapp.RequestHandler):
    def post(self):
        logging.info("manage a POST request")
        # parse the request body into a python dict() object
        args = simplejson.loads(self.request.body)
        # build a dict() with a default message
        message = {'result': 'The request misses user and/or status'}
        if 'user' in args and 'status' in args:
            message['result'] = "The request user %s status is %s" % (args['user'], args['status'])
        # serialize the message and write it to the response
        self.response.out.write(simplejson.dumps(message))
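
For completeness, the handler still has to be bound to the /json/post URL used by dojo.xhrPost. With the webapp framework this is the usual WSGI boilerplate (sketched here, it was not part of the original snippet):

from google.appengine.ext import webapp
from google.appengine.ext.webapp.util import run_wsgi_app

# map the URL called by dojo.xhrPost to the handler above
application = webapp.WSGIApplication([('/json/post', JsonService)], debug=True)

def main():
    run_wsgi_app(application)

if __name__ == '__main__':
    main()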

Sneaking thru webservices like a python

After the brief presentation of php webservices, here’s how to play with SOAP and Python:

>>> from SOAPpy import SOAPProxy
>>> url = 'http://rpolli@babel.it:password@horde.example.com:80/rpc.php'
>>> n = 'urn:horde'
>>> server = SOAPProxy(url, namespace=n)     # 1
>>> server.config.dumpSOAPOut = 1            # 2
>>> server.config.dumpSOAPIn = 1
>>> calendars = server.calendar.list()       # 3

Steps are easy:

  1. set the namespace of the xml request: this is compulsory
  2. set I/O to verbose
  3. execute the call

You can list WSDL methods in a similar way:

>>> from SOAPpy import WSDL
>>> wsdlFile = 'http://www.xmethods.net/sd/2001/TemperatureService.wsdl'
>>> server = WSDL.Proxy(wsdlFile)
>>> server.methods.keys()
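
Once the proxy is built, the methods described in the WSDL can be called directly; in the Dive Into Python example the temperature service exposes getTemp():

>>> server.getTemp('27502')    # call one of the methods listed above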

Examples are taken from: 
http://www.diveintopython.org/soap_web_services/index.html