<?xml version="1.0" encoding="utf-8" standalone="yes"?>
<rss version="2.0" xmlns:atom="http://www.w3.org/2005/Atom" xmlns:content="http://purl.org/rss/1.0/modules/content/">
  <channel>
    <title>golang on diamondcutter</title>
    <link>https://yamin.netlify.app/tags/golang/</link>
    <description>Recent content in golang on diamondcutter</description>
    <image>
      <title>diamondcutter</title>
      <url>https://yamin.netlify.app/papermod-cover.png</url>
      <link>https://yamin.netlify.app/papermod-cover.png</link>
    </image>
    <generator>Hugo -- gohugo.io</generator>
    <lastBuildDate>Thu, 12 Nov 2020 00:00:00 +0000</lastBuildDate><atom:link href="https://yamin.netlify.app/tags/golang/index.xml" rel="self" type="application/rss+xml" />
    <item>
      <title>Kafka protobuf producer</title>
      <link>https://yamin.netlify.app/posts/kafka-protobuf-producer/</link>
      <pubDate>Thu, 12 Nov 2020 00:00:00 +0000</pubDate>
      <guid>https://yamin.netlify.app/posts/kafka-protobuf-producer/</guid>
      <description>&lt;p&gt;An event-driven architecture requires a flexible, reliable schema contract between producer
and consumer applications. &lt;a href=&#34;https://developers.google.com/protocol-buffers&#34;&gt;Protocol buffers&lt;/a&gt; provide an extensible mechanism for serializing structured data: you define the contract once and use the generated source code to write and read data in the language of your choice. In this article, I&amp;rsquo;ll walk through the steps to push a serialized protobuf message into Kafka and verify it by reading the message back from Kafka to the console.&lt;/p&gt;</description>
    </item>
  </channel>
</rss>
