12.0 How to share your Spark (RPiSparkModule) with others?
We call each application a Spark: a subclass that inherits from RPiSparkModule. To complete a Spark application, we only need to implement two RPiSparkModule member functions, setup() and run().
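The two-method contract can be sketched with a stand-in base class. Note that the real base class lives in JMRPiFoundations.Skeleton.RPiSparkModule; the simplified version below exists only so the sketch runs on its own:

```python
# Simplified stand-in for JMRPiFoundations.Skeleton.RPiSparkModule,
# used only to illustrate the two-method contract.
class RPiSparkModule:
    def setup(self):
        raise NotImplementedError

    def run(self):
        raise NotImplementedError


# A Spark is just a subclass that fills in both methods.
class HelloSpark(RPiSparkModule):
    def setup(self):
        # Initialize hardware, screens, key handlers, etc. here.
        pass

    def run(self):
        print("Hello from a Spark!")


spark = HelloSpark()
spark.setup()
spark.run()
```

A real Spark would do its GPIO and display initialization in setup() and its main logic in run(), as the RPiSparkDemo example later in this chapter shows.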
At the same time, a Spark can be combined with other Sparks to form a new application. We can share our Spark modules with others, and others can share theirs with us. This reuse of software modules lets us build Raspberry Pi GPIO applications more quickly.
In this chapter we describe how to share the Spark module.
There are two ways to share your Spark module:

1. Simply copy the Spark module files to others. This is easy, but it becomes cumbersome when sending to many people.
2. Build a PyPI (Python Package Index) distribution package for your Spark and distribute it through the PyPI repository, so that people around the world can easily download, install, and use your Spark module.
Next, let's show you the steps and methods:
PyPI (Python Package Index)
We can share the Spark module with Python users around the world through PyPI (Python Package Index).
See the SetupTools manual for more details on building and distributing packages.
Build Spark distribution package
Let's walk through a simple example showing how to build and publish a Spark package.
First, create the directory structure of the PyPI distribution:

mkdir spark_demo
mkdir spark_demo/SparkDemo
touch spark_demo/SparkDemo/__init__.py
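The empty __init__.py is what makes the package directory importable and lets setuptools' find_packages() discover it. The following sketch recreates the same layout in a temporary directory to show this (the directory names mirror the tutorial's layout):

```python
import os
import tempfile

from setuptools import find_packages

# Recreate the tutorial's layout in a temporary directory.
root = tempfile.mkdtemp()
pkg = os.path.join(root, "SparkDemo")
os.makedirs(pkg)

# Without this empty __init__.py, find_packages() would not
# recognize SparkDemo as a package.
open(os.path.join(pkg, "__init__.py"), "w").close()

print(find_packages(where=root))  # expected: ['SparkDemo']
```

This is exactly what find_packages() does later in setup.py, except that there it runs from the spark_demo directory itself.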
Then create a simple Spark module named RPiSparkDemo.py under the path spark_demo/SparkDemo and enter the following:
from JMRPiFoundations.Skeleton.RPiSparkModule import RPiSparkModule

class RPiSparkDemo(RPiSparkModule):
    def setup(self):
        pass

    def run(self):
        print("This is RPi-Spark Module Demo.....")
Save the file. Next, create a file named setup.py in the spark_demo directory and enter the following:
from setuptools import setup, find_packages

classifiers = [
    'Development Status :: 4 - Beta',
    'Operating System :: POSIX :: Linux',
    'License :: OSI Approved :: MIT License',
    'Intended Audience :: Developers',
    'Programming Language :: Python :: 2.7',
    'Programming Language :: Python :: 3',
    'Topic :: Software Development',
    'Topic :: System :: Hardware',
    'Topic :: System :: Hardware :: Hardware Drivers'
]

keywords = ("Your keywords of Spark")
desc = "This is a spark package demo"
packages = find_packages()

setup(
    name = 'RPiSparkDemo',
    version = '1.0.0',
    author = 'Kunpeng Zhang',
    author_email = 'youremail@your_domain_name.com',
    description = desc,
    long_description = desc,
    platforms = ['Linux'],
    license = 'MIT',
    classifiers = classifiers,
    keywords = keywords,
    url = 'your_url',
    dependency_links = [],
    install_requires = [],
    packages = packages
)
Publish Spark package
To publish your Spark package to people around the world, you first need to register an account on PyPI and create a ~/.pypirc file on your local disk, which configures the PyPI server and account information:
[distutils]
index-servers = pypi

[pypi]
username:your_pypi_username
password:your_pypi_password
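The ~/.pypirc file uses standard INI syntax, so you can sanity-check its structure with Python's configparser. This is just a hypothetical check for illustration, not part of the packaging workflow:

```python
import configparser

# The .pypirc content from above, embedded as a string for illustration.
pypirc = """\
[distutils]
index-servers = pypi

[pypi]
username:your_pypi_username
password:your_pypi_password
"""

# configparser accepts both '=' and ':' as key/value delimiters,
# which is why the username/password lines parse correctly.
cfg = configparser.ConfigParser()
cfg.read_string(pypirc)
print(cfg.get("pypi", "username"))  # your_pypi_username
```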
Then register the Spark package:

python setup.py register

You can upload your Spark package to PyPI after the above command succeeds:

python setup.py sdist upload

Note: newer versions of PyPI no longer support setup.py register and setup.py upload; on a current system, the twine tool is the recommended way to upload distribution packages.
Install and use the Spark package
After the Spark package is released on PyPI, we can install it via the pip command:
sudo pip install RPiSparkDemo
Once installed, we can integrate this Spark package into our new project, e.g.:
from JMRPiFoundations.Skeleton.RPiSparkProvider import initSpark
from JMRPiFoundations.Devices.rpi_spark_z_1_0_0 import RPiSparkConfig as mySparkConfig
from SparkDemo.RPiSparkDemo import RPiSparkDemo

mySpark = initSpark()
myDemo = RPiSparkDemo(mySparkConfig, mySpark)
myDemo.run()
Save this as demo.py and execute the following command in the terminal:

python demo.py
At this time you should see the following text on the screen:
$>This is RPi-Spark Module Demo.....