In some of my projects, I like to provide accurate documentation, which means keeping examples and docs up to date.
So when I update a library or a service I'm using in my code, I need to manually find and replace all the text that mentions the old version.
This post describes how you can do this automatically using GitHub Actions, Maven and Dependabot.
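For context, the Dependabot side of such a setup could look like this minimal `.github/dependabot.yml` sketch (adjust the directory and schedule to your repository):

```yaml
# .github/dependabot.yml
version: 2
updates:
  # Watch the Maven dependencies at the repository root
  - package-ecosystem: "maven"
    directory: "/"
    schedule:
      interval: "weekly"
```

Dependabot then opens pull requests when dependencies change, and a GitHub Actions workflow can react to those to update the documentation.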
When I joined Elastic (formerly Elasticsearch) it was a startup with 10 employees plus the founders. As one of those first employees, I was invited (with #elkie and my wife) to the NYSE event where Elastic was listed under the ESTC symbol.
Some of us were there (Rashid, Karel, myself, Igor, Costin, Luca, Clinton). Yeah, you are probably not used to seeing us wearing suits! :)
If you want to read my story again, it's here:
This blog post is part of a series of 3:
1. Importing Bano dataset with Logstash
2. Using Logstash to lookup for addresses in Bano index
3. Using Logstash to enrich an existing dataset with Bano

In the previous post, we described how we can transform a postal address into a normalized one, enriched with its geo location point, or transform a geo location point into a postal address.

Let's say we have an existing dataset we want to enrich.
This blog post is part of a series of 3:
1. Importing Bano dataset with Logstash
2. Using Logstash to lookup for addresses in Bano index
3. Using Logstash to enrich an existing dataset with Bano

In the previous post, we described how we indexed data coming from the BANO project, so we now have indices containing all the French postal addresses.

Let's see what we can do now with this dataset.

Searching for addresses

Good. Can we use a search engine to search?
This blog post is part of a series of 3:
1. Importing Bano dataset with Logstash
2. Using Logstash to lookup for addresses in Bano index
3. Using Logstash to enrich an existing dataset with Bano

I'm not really sure why, but I love the postal address use case. Often in my career I have had to deal with that information. Very often the information is not well formatted, so it's hard to find what you need when your input is a not-so-nice dataset.
What a milestone! Can you imagine how much the company has changed in the last 5 years? From 10 employees when I joined to more than 700 now!
If you want to read my story again, it's here:
2013: Once upon a time…
2014: Once upon a time: a year later…
2015: Once upon a time: Make your dreams come true
2016: 3 years! Time flies!
2017: 4 years at elastic!

Before speaking about what happened over the last 5 years for me, let's modify a bit the script I wrote last year.
This post is starting to become a long series!
Yeah! That's amazing! I just spent 4 years working at elastic and I'm starting my happy 5th year!
If you want to read my story again, it's here:
2013: Once upon a time…
2014: Once upon a time: a year later…
2015: Once upon a time: Make your dreams come true
2016: 3 years! Time flies!

This year, I will celebrate by writing a new tutorial…
In a recent post we have seen how to create real integration tests. Those tests launch a real elasticsearch cluster, run some tests you write with JUnit or your favorite test framework, then stop the cluster.
But sometimes, you may want to add existing plugins in your integration test cluster.
For example, you might want to use X-Pack to bring fantastic features such as:
- Security
- Alerting
- Monitoring
- Graph
- Reporting

Let's see how you can do that with Maven and Ant again…
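As a very rough sketch of the Ant side (the `${es.home}` property and the surrounding maven-antrun-plugin wiring are assumptions, not the final recipe), installing X-Pack into the distribution used by the test cluster could be done with an exec task before the cluster starts:

```xml
<!-- Sketch: install X-Pack into the unpacked elasticsearch 5.x
     distribution pointed at by ${es.home} (hypothetical property) -->
<exec executable="${es.home}/bin/elasticsearch-plugin" failonerror="true">
  <arg value="install"/>
  <arg value="x-pack"/>
</exec>
```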
This blog post is part of a series which will teach you:
1. How to write a plugin for elasticsearch 5.0 using Maven.
2. How to add a new REST endpoint plugin to elasticsearch 5.0.
3. How to use Transport Action classes (what you are reading now).
4. How I wrote the ingest-bano plugin, which will hopefully be released soonish. In this plugin, new REST endpoints have been added.

In the previous article, we discovered how to add a REST plugin.
This blog post is part of a series which will teach you:
1. How to write a plugin for elasticsearch 5.0 using Maven.
2. How to add a new REST endpoint plugin to elasticsearch 5.0 (what you are reading now).
3. How I wrote the ingest-bano plugin, which will hopefully be released soonish. In this plugin, new REST endpoints have been added.

Imagine that you wish to add a new REST endpoint so you can send requests like:
Integration tests… How do you run them?
Often, you are tempted to run the services you want to test from JUnit, for example. In elasticsearch, you can extend the ESIntegTestCase class, which will start a cluster with a given number of nodes.
public class BanoPluginIntegrationTest extends ESIntegTestCase {
    public void testPluginIsLoaded() throws Exception {
        // Your code here
    }
}

But to be honest, the test you are running does not guarantee that you will have the same result in production.
This blog post is part of a series which will teach you:
1. How to write a plugin for elasticsearch 5.0 using Maven.
2. How to write an ingest plugin for elasticsearch 5.0 (what you are reading now).
3. How I wrote the ingest-bano plugin, which will hopefully be released soonish.

Today, we will focus on writing an Ingest plugin for elasticsearch.
Hey! Wait! You wrote Ingest? What is that?
Ingest is a new feature coming in elasticsearch 5.
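To make the idea concrete, here is a hedged sketch in plain Java (no elasticsearch dependency; `SimpleProcessor` and `LowercaseProcessor` are hypothetical names modeling the concept, not the real API): an ingest pipeline is an ordered list of processors, each transforming a document's fields before indexing.

```java
import java.util.Map;

// Hypothetical stand-in for the ingest Processor concept:
// each processor mutates the document's field map before indexing.
interface SimpleProcessor {
    void execute(Map<String, Object> document);
}

// Example: lowercase the value of a single field, similar in spirit
// to a "lowercase" ingest processor.
class LowercaseProcessor implements SimpleProcessor {
    private final String field;

    LowercaseProcessor(String field) {
        this.field = field;
    }

    @Override
    public void execute(Map<String, Object> document) {
        Object value = document.get(field);
        if (value instanceof String) {
            document.put(field, ((String) value).toLowerCase());
        }
    }
}
```

A pipeline is then just an ordered list of such processors applied to every incoming document before it reaches the index.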
Elasticsearch 5.0 switched to Gradle in October 2015.
You can obviously write a plugin using Gradle if you wish, and you would benefit from all the goodies the elasticsearch team wrote when it comes to integration tests and so on.
My colleague, Alexander Reelsen aka Spinscale on Twitter, wrote a super nice template if you wish to create an Ingest plugin for 5.0.
Hey! Wait! You wrote Ingest? What is that?
Ingest is a new feature coming in elasticsearch 5.
Sounds like cool music, right? At least this is one of my favorite tracks.
Maybe some of you already know that I enjoy doing some DJing for my friends.
But today, I want to speak about another kind of beats. Elastic beats!
Elastic Beats
Actually my favorite funky music track is one from Georges Duke: Reach out! But this is another story…
Beats

So what are beats?
Beats are lightweight shippers that collect and ship all kinds of operational data to Elasticsearch
I gave a BBL talk recently and, while chatting with attendees, one of them told me about a simple use case he covered with elasticsearch: indexing metadata of files on a NAS with a simple ls -lR like command. He needs to be able to search the NAS for files when a user wants to restore a deleted file.
As you can imagine, a search engine is super helpful when you have hundreds of millions of files!
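As a rough illustration of the idea (not the attendee's actual implementation; the `FileScanner` class is hypothetical), a recursive walk that turns each file's metadata into a small JSON-like document ready to be indexed could look like this:

```java
import java.io.IOException;
import java.nio.file.Files;
import java.nio.file.Path;
import java.util.List;
import java.util.stream.Collectors;
import java.util.stream.Stream;

class FileScanner {
    // Walk a directory tree (like ls -lR) and build one metadata
    // document per regular file, ready to be sent to elasticsearch.
    static List<String> scan(Path root) throws IOException {
        try (Stream<Path> paths = Files.walk(root)) {
            return paths.filter(Files::isRegularFile)
                    .map(p -> {
                        try {
                            return String.format("{\"path\":\"%s\",\"size\":%d}",
                                    p, Files.size(p));
                        } catch (IOException e) {
                            throw new RuntimeException(e);
                        }
                    })
                    .collect(Collectors.toList());
        }
    }
}
```

Each resulting document can then be bulk indexed, making deleted-file lookups a simple full-text search on the path.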
Some months ago, I published a recipe on how to index Twitter with Logstash and Elasticsearch.
I have the same need today as I want to monitor Twitter when we run the elastic FR meetup (join us, by the way, if you are in France!).
Well, this recipe can be really simplified, and actually I don't want to waste my time building and managing elasticsearch and Kibana clusters anymore.
Let's use a Found by elastic cluster instead.
This article is based on the Recommender System with Mahout and Elasticsearch tutorial created by MapR.
It now uses the 20M MovieLens dataset, which contains 20 million ratings and 465,000 tag applications applied to 27,000 movies by 138,000 users, and was released in April 2015. The format of this recent version has changed a bit, so I needed to adapt the existing scripts to the new format.
Prerequisites

Download the 20M MovieLens dataset. Unzip it.
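To get a feel for the new format, here is a minimal sketch (assuming the `userId,movieId,rating,timestamp` header that the 20M release's ratings.csv uses; the `RatingParser` class is mine, not from the MapR tutorial) that parses rating lines:

```java
import java.util.Arrays;
import java.util.List;
import java.util.stream.Collectors;

class RatingParser {
    // One rating from ratings.csv (header: userId,movieId,rating,timestamp).
    static class Rating {
        final long userId;
        final long movieId;
        final double rating;

        Rating(long userId, long movieId, double rating) {
            this.userId = userId;
            this.movieId = movieId;
            this.rating = rating;
        }
    }

    // Parse CSV lines, skipping the header row the new format adds.
    static List<Rating> parse(List<String> lines) {
        return lines.stream()
                .skip(1)
                .map(l -> l.split(","))
                .map(f -> new Rating(Long.parseLong(f[0]),
                        Long.parseLong(f[1]),
                        Double.parseDouble(f[2])))
                .collect(Collectors.toList());
    }
}
```

The older 10M format used `::` as a separator and had no header, which is essentially what the script changes boil down to.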
Recently, I got a MySQL database dump and I was thinking of importing it into elasticsearch.
The first idea that popped up was:
1. install MySQL
2. import the database
3. read the database with Logstash and import into elasticsearch
4. drop the database
5. uninstall MySQL

Well, I found that some of those steps are really not needed.
I can actually use the ELK stack to create a simple recipe for importing SQL dump scripts without having to load the data into a database and then read it back from the database.
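To sketch the idea (a simplification handling only single-row INSERT statements; real dumps are messier and the `DumpParser` class is hypothetical), extracting the value tuples straight from the dump text could look like:

```java
import java.util.ArrayList;
import java.util.List;
import java.util.regex.Matcher;
import java.util.regex.Pattern;

class DumpParser {
    // Matches simple single-row statements like:
    //   INSERT INTO `person` VALUES (1,'Joe','Paris');
    private static final Pattern INSERT =
            Pattern.compile("INSERT INTO `?(\\w+)`? VALUES \\((.*)\\);");

    // Return the raw value list of each INSERT found in the dump lines,
    // without ever loading anything into a database.
    static List<String> values(List<String> dumpLines) {
        List<String> rows = new ArrayList<>();
        for (String line : dumpLines) {
            Matcher m = INSERT.matcher(line.trim());
            if (m.matches()) {
                rows.add(m.group(2));
            }
        }
        return rows;
    }
}
```

In the actual recipe, a grok or similar filter in Logstash plays this role, turning each extracted tuple into a document.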
I'm often running demos during conferences where we have a booth. Like many others, I'm using the Twitter feed as my datasource.
I have been using the Twitter river plugin for many years but, you know, rivers have been deprecated.
Logstash 1.5.0 provides a safer and more flexible way to deal with tweets with its twitter input.
Let's do it!
Let's assume that you already have elasticsearch 1.5.2, Logstash 1.5.0 and Kibana 4.0.2 running on your laptop or on a cloud instance.
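To give an idea of the final pipeline, here is a hedged sketch of a Logstash configuration using the twitter input (the credential values are placeholders for your own Twitter application keys, and the keyword list is just an example):

```
# Sketch of a Logstash 1.5 pipeline built on the twitter input.
input {
  twitter {
    consumer_key => "..."
    consumer_secret => "..."
    oauth_token => "..."
    oauth_token_secret => "..."
    keywords => ["elastic"]
  }
}
output {
  elasticsearch {
    host => "localhost"
    protocol => "http"
  }
}
```

Point the elasticsearch output at your own cluster, then open Kibana to explore the incoming tweets.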