Tenders and information aggregators

May 14, 2015

Over our years in this field, our company has developed hundreds of site parsers. It is no secret that many websites restrict automated reading of their information: a limited number of requests per time period, IP-based access restrictions, deliberately convoluted page layouts, and frequent changes to the site's structure. Developers have many counter-measures in their arsenal, including proxy servers and heuristic, adaptive algorithms. For the reasons described above, we usually code such projects on a time-based payment scheme.
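The proxy counter-measure mentioned above can be sketched as follows. This is a minimal illustration, not our production code: a round-robin proxy pool with a per-proxy cooldown, so that no single address exceeds a target site's request-rate limit. The proxy addresses and the cooldown value are placeholders.

```python
import time
from collections import deque

class ProxyRotator:
    """Round-robin proxy pool with a per-proxy cooldown, so no single
    address exceeds the target site's request-rate limit (sketch)."""

    def __init__(self, proxies, cooldown_seconds=10.0):
        self._pool = deque(proxies)
        self._cooldown = cooldown_seconds
        self._last_used = {}  # proxy -> timestamp of its last request

    def acquire(self, now=None):
        """Return the next proxy whose cooldown has expired, or None."""
        now = time.monotonic() if now is None else now
        for _ in range(len(self._pool)):
            proxy = self._pool[0]
            self._pool.rotate(-1)
            if now - self._last_used.get(proxy, float("-inf")) >= self._cooldown:
                self._last_used[proxy] = now
                return proxy
        return None  # every proxy is cooling down; the caller should wait

rotator = ProxyRotator(["10.0.0.1:3128", "10.0.0.2:3128"], cooldown_seconds=5.0)
print(rotator.acquire(now=100.0))  # → 10.0.0.1:3128
print(rotator.acquire(now=100.0))  # → 10.0.0.2:3128
print(rotator.acquire(now=100.0))  # → None (both proxies on cooldown)
print(rotator.acquire(now=106.0))  # → 10.0.0.1:3128
```

A real parser would combine this with retries and randomised delays; the rotation logic shown here is the core of the idea.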

Online clothes store

Discounted clothes website

Redesign of the website of a German online clothes store aggregator. Because the partner stores supply heterogeneous data, an algorithm had to be programmed to automatically sort goods into the shop's categories. Functional blocks and administration panels were also created to the client's requirements during the redesign, and the working speed of the "heaviest" parts of the site was optimised with caching.
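The automatic sorting of heterogeneous goods can be illustrated with a minimal keyword-matching sketch. The real algorithm was tuned per store feed; the category names and keyword lists below are invented for the example.

```python
# Hypothetical keyword rules; the real mapping was tuned per store feed.
CATEGORY_KEYWORDS = {
    "Shoes":   ["sneaker", "boot", "sandal"],
    "Jackets": ["jacket", "parka", "coat"],
    "Dresses": ["dress", "gown"],
}

def categorise(product_title, default="Uncategorised"):
    """Assign a product to the first category whose keyword appears in
    the lower-cased title; unmatched items fall through to a default."""
    title = product_title.lower()
    for category, keywords in CATEGORY_KEYWORDS.items():
        if any(word in title for word in keywords):
            return category
    return default

print(categorise("Winter Parka XL"))      # → Jackets
print(categorise("Leather ankle boot"))   # → Shoes
print(categorise("Silk scarf"))           # → Uncategorised
```

In practice items that fall into the default bucket would go to a manual review queue, and the rules would be refined from those cases.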


Company sales tool

Development of a tool for gathering internet content and keeping track of website updates. The tool runs as a service that gathers information from the internet on a timetable, at predetermined intervals, using a given set of templates. Data is collected via connectors (RSS, blogs, XML, web pages, etc.). The tool analyses the content of each connector against preset criteria, which act as a set of complex filters. Results of the content analysis are presented as a list and can be sent out automatically by email. This solution has been used to build a service for thematic monitoring of mass media and internet content.
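The "criteria as complex filters" idea can be sketched as composable predicates over content items. This is an illustrative model only; the field names (`connector`, `text`) and helper names are assumptions, not the tool's actual API.

```python
# Each criterion is a predicate over a content item; criteria compose
# with all_of / any_of into arbitrarily complex filters.

def contains(term):
    return lambda item: term.lower() in item["text"].lower()

def from_connector(name):
    return lambda item: item["connector"] == name

def all_of(*preds):
    return lambda item: all(p(item) for p in preds)

def any_of(*preds):
    return lambda item: any(p(item) for p in preds)

items = [
    {"connector": "rss",  "text": "New tender for road construction"},
    {"connector": "blog", "text": "Company picnic photos"},
    {"connector": "rss",  "text": "IT procurement announced"},
]

# Keep RSS items that mention either tenders or procurement.
criterion = all_of(from_connector("rss"),
                   any_of(contains("tender"), contains("procurement")))
matches = [i["text"] for i in items if criterion(i)]
print(matches)  # → ['New tender for road construction', 'IT procurement announced']
```

The resulting match list corresponds to the report that the service emails out on schedule.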

Tenders aggregator

Information portal for state tenders and commercial procurements

The service, updated daily, maintains a database of more than 20,000 bidding offers, sorted by sector and region. Users are offered information search tools and a subscription to tender events matching chosen filters. Technical maintenance of the site is carried out regularly, and parsers for collecting tenders from additional sources are continually developed and supported. An algorithm was developed to automatically filter tenders by name based on headline editing rules. A cross-platform application, Skiper, was created for interacting with the portal in offline mode.
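Filtering tenders by name is only reliable if equivalent headlines normalise to the same string first. The editing rules below are hypothetical examples in the spirit of the ones described (case and whitespace normalisation, abbreviation expansion, dropping lot numbers), not the portal's actual rule set.

```python
import re

# Hypothetical editing rules: expand common abbreviations, drop lot
# numbers, strip punctuation, and normalise case and whitespace so that
# equivalent headlines compare equal.
ABBREVIATIONS = {"constr.": "construction", "equip.": "equipment"}

def normalise_headline(headline):
    text = headline.lower().strip()
    for short, full in ABBREVIATIONS.items():
        text = text.replace(short, full)
    text = re.sub(r"lot\s*№?\s*\d+", "", text)    # drop lot numbers
    text = re.sub(r"[^\w\s]", " ", text)          # strip punctuation
    return re.sub(r"\s+", " ", text).strip()      # collapse whitespace

a = normalise_headline("Supply of constr. equip., lot №3")
b = normalise_headline("Supply of construction equipment")
print(a == b)  # → True
```

Once headlines are normalised this way, name-based filters and duplicate detection reduce to plain string comparison.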

Tenders aggregator

System tray utility for notifications about news and new tenders

An application for quick access to tenders and auctions on the Magelan website. The system is intended to provide simultaneous access for several thousand local-network and internet users. The project was written in C++ with Qt and includes the following components:

  • Cross-platform client GUI running on Windows, Linux and macOS that receives bid mailing lists and morphological search results over TCP/IP
  • Local SQLite database for storing search history and working in offline mode
  • Web interface with a REST query architecture
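The local SQLite component in the list above can be sketched as a small search-history store. The table name and schema here are assumptions for illustration (the real client is C++/Qt); `:memory:` stands in for the on-disk database file.

```python
import sqlite3

def open_history(path=":memory:"):
    """Open (or create) the local search-history store."""
    conn = sqlite3.connect(path)
    conn.execute(
        "CREATE TABLE IF NOT EXISTS search_history ("
        " id INTEGER PRIMARY KEY,"
        " query TEXT NOT NULL,"
        " run_at TEXT DEFAULT CURRENT_TIMESTAMP)"
    )
    return conn

def record_search(conn, query):
    conn.execute("INSERT INTO search_history (query) VALUES (?)", (query,))
    conn.commit()

def recent_searches(conn, limit=10):
    rows = conn.execute(
        "SELECT query FROM search_history ORDER BY id DESC LIMIT ?", (limit,)
    )
    return [q for (q,) in rows]

conn = open_history()
record_search(conn, "road construction")
record_search(conn, "IT procurement")
print(recent_searches(conn))  # → ['IT procurement', 'road construction']
```

Because the history lives in a local file, the client can replay recent searches against cached results even with no network connection.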
Skiper main window
Skiper search
Skiper filters