
Tenders and information aggregators

May 14, 2015


During our time in this field, our company has developed hundreds of site parsers. It's no secret that many websites restrict access to their content: they limit the number of requests within a given time period, block access by IP address, use sophisticated page layouts and make regular changes to the site's structure. Developers have many counter-measures in their arsenal, including proxy servers and heuristic and adaptive algorithms. For the reasons described above, we usually code such projects on a time-based payment scheme.

We have developed a product that can be used to build algorithms for advertising agencies, mention monitoring, specialized search engines, sales-department utilities, parsers for goods and services, and analytical and statistical utilities. EDISON's own sales department uses it as a tool for collecting information from a dozen sources each day and presenting it in a clear format for managers.
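The proxy-rotation counter-measure can be sketched in a few lines of Python; the `fetch` interface and proxy names below are hypothetical illustrations, not our production code:

```python
import itertools
import time

def fetch_with_rotation(url, proxies, fetch, max_attempts=6, delay=0.0):
    """Try a request through each proxy in turn, rotating on failure.

    `fetch` is any callable taking (url, proxy) that returns the page
    body or raises when blocked or rate-limited (hypothetical interface).
    """
    pool = itertools.cycle(proxies)
    last_error = None
    for attempt in range(max_attempts):
        proxy = next(pool)
        try:
            return fetch(url, proxy)
        except Exception as exc:   # blocked, rate-limited, timed out, ...
            last_error = exc
            time.sleep(delay)      # back off before trying the next proxy
    raise RuntimeError(f"all {max_attempts} attempts failed") from last_error
```

In practice the pool would be refreshed as proxies get banned, and the backoff made adaptive; this sketch shows only the rotation skeleton.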


Discounted clothes website

Client: Online clothes store
Description:

Redesign of the website of a German online clothes store aggregator. Since the source stores supply heterogeneous data, an algorithm was programmed to automatically sort goods into the shop's categories. Functional blocks and administrative panels were also created to suit the client's needs during the redesign, and the working speed of the "heaviest" parts of the site was optimised using caching. Technologies used: PHP, MySQL, XML, ABO.CMS, Memcached.
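Automatic sorting of heterogeneous store feeds can be illustrated with a minimal keyword-rule sketch; the rules and category names below are assumptions for illustration, not the algorithm actually deployed:

```python
# Each source store supplies its own free-form category labels; we map
# them onto the shop's taxonomy with simple substring rules.
CATEGORY_RULES = {
    "dresses": ("dress", "kleid"),
    "shoes":   ("shoe", "sneaker", "schuh"),
    "jackets": ("jacket", "coat", "jacke"),
}

def map_category(raw_category):
    """Map a store's own category label onto the shop's taxonomy."""
    label = raw_category.lower()
    for target, keywords in CATEGORY_RULES.items():
        if any(kw in label for kw in keywords):
            return target
    return "uncategorised"   # fall through for manual review
```

A real mapper would also use product titles and attributes, but the rule-table idea is the same.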


Company sales tool

Client: EDISON
Description:

Development of a tool for gathering internet content and keeping track of website updates. The tool runs as a service that collects information from the internet on a timetable, at predetermined intervals and using a given set of templates. Data is collected via connectors (RSS, blog, XML, web pages, etc.), and the content from each connector is analysed against preset criteria that act as a set of complex filters. Results of the content analysis appear as a list and can be sent out automatically by email. This solution has been used to build a service for thematic monitoring of mass media and internet content. Technologies used: ASP.NET, C#, SQL.
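The connector-plus-filter pipeline described above can be sketched as follows; the connector interface and the filter criteria are assumptions for illustration, not the tool's actual API (which is written in C#):

```python
def collect(connectors, criteria):
    """Pull items from every connector and keep those matching all filters."""
    results = []
    for connector in connectors:
        for item in connector():          # each connector yields item dicts
            if all(check(item) for check in criteria):
                results.append(item)
    return results

# Example: an RSS-like connector stub and a single keyword criterion.
def rss_stub():
    yield {"title": "New tender: road repair", "source": "rss"}
    yield {"title": "Weather report", "source": "rss"}

criteria = [
    lambda item: "tender" in item["title"].lower(),
]
matches = collect([rss_stub], criteria)
```

In the real service the matched list would then be rendered for the reviewer and dispatched by email on the schedule.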


Magelan: an information portal for state tenders and commercial procurement

Client: Tenders aggregator
Description:

The service, which is updated each day, has a database of more than 20,000 bidding offers. The database is sorted by sector and region. Users are offered information search tools and a subscription to tender events according to chosen filters. Technical site maintenance is carried out regularly. Parsers for collecting tenders from additional sources are being continually developed and supported. An algorithm has been developed to automatically filter tenders by name based on headline editing rules. A cross-platform application, Skiper, was created for interaction with the portal in offline mode.
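Name-based filtering via headline editing rules might look like the following sketch, assuming simple regex normalisation; the rules shown are illustrative, not the portal's actual rule set:

```python
import re

# Ordered editing rules: each pattern is applied to the headline in turn.
EDIT_RULES = [
    (re.compile(r"\s+"), " "),                                   # collapse whitespace
    (re.compile(r"^(lot|tender)\s*[:#]?\s*\d*\s*", re.I), ""),   # strip boilerplate prefixes
]

def normalise_headline(title):
    """Apply the editing rules to produce a clean, comparable headline."""
    title = title.strip()
    for pattern, repl in EDIT_RULES:
        title = pattern.sub(repl, title)
    return title.strip().capitalize()

def matches_filter(title, keywords):
    """True if the normalised headline contains any subscriber keyword."""
    clean = normalise_headline(title).lower()
    return any(kw in clean for kw in keywords)
```

Normalising before matching keeps a subscriber's filter stable even when different procurement sources decorate the same tender name differently.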


News and new tenders system tray notification utility

Client: Tenders aggregator
Description:

An application for quick access to tenders and auctions on the Magelan website. The system is intended to provide simultaneous access for several thousand local-network and internet users. The project was coded in C++ with Qt and included the following components:

  • Cross-platform client GUI running on Windows, Linux and macOS for receiving mailing lists of bids and morphological search results over TCP/IP
  • Local SQLite database for storing search history and working in offline mode
  • Web-interface with REST query architecture
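The offline-mode component above can be illustrated with a small local-history sketch; the table schema is hypothetical, and Python's `sqlite3` stands in here purely for illustration (the Skiper client itself is C++/Qt):

```python
import sqlite3

def open_history(path=":memory:"):
    """Open (or create) the local search-history store."""
    db = sqlite3.connect(path)
    db.execute("""CREATE TABLE IF NOT EXISTS search_history (
                      id INTEGER PRIMARY KEY,
                      query TEXT NOT NULL,
                      ran_at TEXT DEFAULT CURRENT_TIMESTAMP)""")
    return db

def remember(db, query):
    """Record a search so it is available offline later."""
    db.execute("INSERT INTO search_history (query) VALUES (?)", (query,))
    db.commit()

def recent(db, limit=10):
    """Return the most recent queries, newest first."""
    rows = db.execute(
        "SELECT query FROM search_history ORDER BY id DESC LIMIT ?", (limit,))
    return [r[0] for r in rows]
```

Keeping the history in an embedded SQLite file means the client needs no server round-trip to replay past searches when the network is unavailable.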