Results for category "Project Ideas"

2 Articles

Side project: A tool for preparing and calculating quotes with Neo4j

One part of my work is to inspect requests for quotes (RfQs), gather customer requirements, estimate the effort, and prepare a written offer. Most of our customers do not want to pay a separate bill for every sprint or user story; they want a single number or range for the total cost of the project. A big problem is the complexity of our projects: we do not only develop software but also design user interfaces and host applications.

In the last ten years I have tried and seen a lot of methods for calculating the estimated costs of software development projects: Function Point Analysis, COCOMO, Microsoft Excel, applying modifiers for project management, risk, and other factors, and using tool combinations like JIRA and Confluence. In the end, every method I have tried has disadvantages in one way or another.

That being said, one year ago I started to gather my own requirements for a cost calculation tool that fits into my workflow and reduces my workload:

  • I need a user interface to capture tasks (user stories, subtasks) and required material (servers, licenses, hosting costs, etc.). Each of these items can be valued (e.g. effort in hours, material cost in euros).
  • Items can be grouped together to form components. Each component can itself contain components.
  • Changing the cost or effort of any item should result in a recalculation of the parent components and the whole offer.
  • I do not want to contaminate JIRA with estimations. JIRA is a project management tool, not a tool for estimating project costs. Portfolio for JIRA also does not match these requirements.
  • After I have structured the requirements and estimated their costs/efforts, I want to export the offer in a distributable format (PDF, XML, …).
  • After the customer has agreed to order components/tasks of our offer, I want to export these items from the calculation tool into JIRA.
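The nested component/item model from the requirements above can be sketched as a recursive composite, where changing any leaf value changes every recomputed parent total. This is a minimal illustration, not the tool's actual API; all class and method names (Component, Item, totalCost) are mine, and a single cost metric stands in for the tool's multiple metrics:

```java
import java.util.ArrayList;
import java.util.List;

public class QuoteTree {
    interface Node { double totalCost(); }

    // A leaf item: a task or a piece of material with a single cost value.
    static class Item implements Node {
        double cost;
        Item(double cost) { this.cost = cost; }
        public double totalCost() { return cost; }
    }

    // A component groups items and other components to unlimited depth.
    static class Component implements Node {
        final List<Node> children = new ArrayList<>();
        Component add(Node child) { children.add(child); return this; }
        // The total is always derived from the children, so editing a leaf
        // "recalculates" every parent and the whole offer.
        public double totalCost() {
            return children.stream().mapToDouble(Node::totalCost).sum();
        }
    }

    public static void main(String[] args) {
        Item hosting = new Item(200.0);
        Component backend = new Component().add(new Item(800.0)).add(hosting);
        Component offer = new Component().add(backend).add(new Item(500.0));
        System.out.println(offer.totalCost()); // 1500.0
        hosting.cost = 300.0;                  // editing a leaf ...
        System.out.println(offer.totalCost()); // ... updates the total: 1600.0
    }
}
```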

The ideal workflow would be:

  • The customer gets in touch with us, we receive the specifications.
  • I break down the specifications into components and derive the tasks and the required material.
  • Our team estimates the effort.
  • My superior and I define the quality level and cost of each task. The total costs are calculated automatically.
  • We finalize the offer and send the customer a link to it.
  • The customer selects the components/tasks (user stories) they want to buy.
  • The cost/pricing of selected components/tasks can be exported to the billing tool.
  • Selected components/tasks can be exported to JIRA.

Decisions

The project idea matured over months before I started a first architectural draft. One big decision was choosing the DBMS to store the data. Instead of using PostgreSQL I chose Neo4j. One of the reasons for choosing Neo4j was the requirement that components can be nested to unlimited depth. Yes, I know Common Table Expressions (CTEs), but hierarchical structures like quotes or bills of materials (BOMs) are much easier to model with a graph database.

 

The current graph schema does not look exactly like this but it should give you an idea of the internal structure.

Schema of Request for Quote tool


As you can see, containers can be nested to unlimited depth. Multiple metrics can be defined for every node (task, item). For performance reasons, a node's parent container stores the aggregated metrics of its children. The sample data can be found in the Neo4j console.
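The reason for caching aggregates in the parent container is that an edit then costs only one upward walk instead of a full subtree scan. A minimal sketch of this idea, assuming a tree with parent pointers; all names here are illustrative, not the tool's real schema:

```java
public class AggregatedMetrics {
    // Each container caches the sum of its subtree's metric values and
    // forwards any change to its own parent as a delta.
    static class Container {
        Container parent;
        double aggregatedEffort; // cached sum of the whole subtree

        void applyDelta(double delta) { // bubble the change up to the root
            aggregatedEffort += delta;
            if (parent != null) parent.applyDelta(delta);
        }
    }

    static class Task {
        final Container parent;
        double effort;
        Task(Container parent, double effort) {
            this.parent = parent;
            this.effort = effort;
            parent.applyDelta(effort);
        }
        // Updating a task costs O(tree depth), not O(subtree size).
        void setEffort(double newEffort) {
            parent.applyDelta(newEffort - effort);
            effort = newEffort;
        }
    }

    public static void main(String[] args) {
        Container root = new Container();
        Container sub = new Container();
        sub.parent = root;
        Task design = new Task(sub, 8.0);
        new Task(root, 4.0);
        System.out.println(root.aggregatedEffort); // 12.0
        design.setEffort(10.0);
        System.out.println(root.aggregatedEffort); // 14.0
    }
}
```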

Current state

So far I have implemented a first version of the repository and service layer in the backend. Using Java lambda expressions, I built an internal DSL to calculate different metrics, which can be based upon each other. For example, I can specify that if the amount or retail price of an item changes, both are multiplied and stored in the turnover metric for materials:

forMetrics(MetricType.AMOUNT, MetricType.RETAIL_PRICE_PER_UNIT)
	.when(ifAtLeastOneChanged())
	.then(
		announce(
			multiply(MetricType.AMOUNT, MetricType.RETAIL_PRICE_PER_UNIT, MetricType.TURNOVER_MATERIAL)
		)
	);

All metrics can be calculated and aggregated up to the root node. The service layer allows concurrent editing of parts of the hierarchy (moving, adding, or deleting subtrees).

Goals

There is still a lot to do, but I am convinced that this project is cool and offers real value to anyone planning complex offers. I have no idea when (and if) I will ever finish the project. I am building the tool as a SaaS platform, so it should be relatively easy to make some money with it.

If you are interested in more details, drop me a line at me[at]schakko[dot]de.

Project idea: Nostradamus AKA prophetr – a social forecasting network for experts

From time to time programming ideas come to my mind that I cannot forget. I have often started new projects, but due to my limited free time it is hard to finish all of them. In the next blog posts I will describe my ideas with some technical and environmental background. If you are interested in getting more information, buying the idea/source code, or just want to motivate me, drop me a line at me[at]schakko[dot]de. I would really appreciate your feedback.

Making prophecies and verifying their occurrence

It must have been around five or six years ago when I first thought about the idea of making prophecies about technical topics. I remember a talk during lunch with one of my co-workers about some trending IT topic that I had been propagating for a few months. During this talk I said it would be awesome to have a platform where users can enter their prophecies, and other users can verify whether a prophecy has occurred and how exactly it matches the outcome.

From the number of prophecies made and the number of verified prophecies, you could calculate the relevance of a prophet (= a person who makes prophecies), which indicates the prophet's expert level. A higher number of verified prophecies means that you have more expertise in a given topic than other users.
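The post does not give the actual relevance formula, so the following is only a toy sketch of the idea, assuming relevance is a smoothed ratio of verified prophecies to total prophecies (the smoothing keeps a prophet with 9 of 10 verified ahead of one with 1 of 1):

```java
public class Relevance {
    // Hypothetical score: Laplace-smoothed success rate (verified+1)/(total+2),
    // so a large verified track record beats a tiny perfect one.
    static double relevance(int verified, int total) {
        return (verified + 1.0) / (total + 2.0);
    }

    public static void main(String[] args) {
        // 9/10 verified vs. 1/1 verified
        System.out.println(relevance(9, 10) > relevance(1, 1)); // true
    }
}
```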

Possible customers for social forecasting networks

The first intention was to have a social network for selfish reasons. Through some mechanisms, like not being allowed to change a prophecy after someone has voted for it, you would be pinned to one statement that could later be falsified or verified. If you were right, you could always use the classic phrase: “I said so.”

In larger companies you could identify hidden champions or motivate people to take more interest in their expert knowledge. One day I had a talk with my bank account manager, who was highly interested in the project for obvious reasons: the software would allow banks to evaluate the efficiency of stockbrokers without using real money.

Another possible target group was whistleblowers, or persons who want to make sure that a prophecy is published on a specific date. For this I implemented some functionality to encrypt the content of the prophecy with symmetric keys. The keys can be stored on remote servers so that only the prophet is in control of when the prophecy can be published. After Snowden's revelations I instantly thought about this feature again.

 

I have to admit that the project has one big flaw: self-fulfilling prophecies like “I prophesy that the share price of company VWXYZ will drop in the next few days.” If you are already an expert in your area, there is a high chance that other followers will react to this prophecy and sell their shares. The share price will drop and your prophecy could be verified… You get the idea.

Technical background

At first I started with Spring MVC, but after some weeks I switched to PHP/Zend Framework 1.x/MySQL. Most of the statistical computation (relevance of prophets, influence of prophets, and so on) and the social network aspect (who follows whom, which prophecies I can see) is done through database views, which kept the implementation inside the services really simple.
The encryption part, called remote-credential-loader (RCL), is written in Node.js. RCL polls the deposited decryption-key URLs for encrypted prophecies every few minutes. By a given timestamp (e.g. five minutes before the prophecy's release), the URL must provide the AES decryption key; otherwise the prophecy is evaluated as false.
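The decision rule RCL applies on each poll can be sketched as follows. The real RCL is written in Node.js; this Java version only illustrates the logic, and the KeyFetcher interface, the enum names, and the URLs are all placeholders standing in for the actual HTTP poll:

```java
import java.time.Instant;

public class ReleaseCheck {
    // Stand-in for the HTTP poll; returns null while the key is not served.
    interface KeyFetcher { String fetchKey(String url); }

    enum Outcome { DECRYPT_AND_PUBLISH, KEEP_POLLING, MARK_FALSE }

    // deadline = e.g. five minutes before the scheduled release time:
    // if the key URL has not served the AES key by then, the prophecy
    // is evaluated as false.
    static Outcome poll(KeyFetcher fetcher, String url, Instant now, Instant deadline) {
        String key = fetcher.fetchKey(url);
        if (key != null) return Outcome.DECRYPT_AND_PUBLISH;
        return now.isBefore(deadline) ? Outcome.KEEP_POLLING : Outcome.MARK_FALSE;
    }

    public static void main(String[] args) {
        Instant deadline = Instant.parse("2014-01-01T11:55:00Z");
        Instant early = Instant.parse("2014-01-01T11:00:00Z");
        Instant late = Instant.parse("2014-01-01T12:00:00Z");
        KeyFetcher noKey = url -> null;
        KeyFetcher hasKey = url -> "aes-key";

        System.out.println(poll(noKey, "https://example.org/key", early, deadline));  // KEEP_POLLING
        System.out.println(poll(hasKey, "https://example.org/key", early, deadline)); // DECRYPT_AND_PUBLISH
        System.out.println(poll(noKey, "https://example.org/key", late, deadline));   // MARK_FALSE
    }
}
```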

For the frontend I used Twitter Bootstrap 2.

I wrote the whole background documentation (processes, data model, computation) in LaTeX (German only).

Current status of the project

After thinking about the idea for years, I finished the beta within the scope of my Bachelor's project in 2012. The supervising professor, who belongs to the statistics faculty, was really impressed by it. Since December 2012 I have owned prophetr.de and prophetr.com, which were intended to host the social network, but the project is in a classic 80%/20% state: the application lacks LDAP authentication and synchronization for use in enterprise environments, the user interface and design are pragmatic rather than user-friendly, and so on.

A few months after I finished the Bachelor's project, I read an article in c't. If I remember correctly, a team from Austria had received a lot of money for building a social forecasting network like mine. This was more or less the reason why I abandoned the project for the last two years.

Drop me a line at me[at]schakko[dot]de if you are interested in more information.