SSL/TLS interesting facts

The world of security is vast, but it never fails to amaze me. I am excited about this journey of getting to know security in more detail. I recently gave a presentation about SSL at ERNW as part of my training. Even though I had learned the protocol before, I realised there is much more to it than I knew. SSL/TLS is more than a mere protocol; it has become a standard that a huge part of the internet relies on. I really liked the book “Bulletproof SSL and TLS” by Ivan Ristic of SSL Labs. I got the book signed by the author during the last Black Hat EU, but never found enough time to read through the details until now. It was a great experience reading the book.

SSL/TLS sits between the application layer and the transport layer in the network stack. It usually comes as an encapsulation of an application layer protocol like HTTP. These days it is used in almost every field involving the internet: web browsing, email, instant messaging, VoIP, internet faxing and so on. The major idea behind SSL/TLS is to prevent man-in-the-middle (MITM) attacks. There are numerous possibilities for how and why an MITM attack occurs. It could be individual hackers trying to steal bank PINs, or even government agencies trying to spy on their own people and the people of other nations. MITM attacks can be categorized as active or passive. Active attacks mainly aim to trick authentication and impersonate another party. Passive attacks are based on capturing network traffic and analysing the packets, which can lead to information disclosure. Sometimes even encrypted traffic is analysed to find weaknesses in the encryption used. SSL was designed to prevent these different types of man-in-the-middle attacks. It provides confidentiality, integrity, authentication and non-repudiation.
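To see this “encapsulation” concretely, here is a minimal sketch (not production code) using Python’s standard `ssl` module: a plain TCP socket is opened first, then wrapped in TLS, and only then does HTTP get spoken over it. The function names are my own.

```python
import socket
import ssl

def make_tls_context():
    """Build a client-side TLS context with certificate checking enabled."""
    ctx = ssl.create_default_context()  # secure defaults: verify certs, check hostname
    ctx.minimum_version = ssl.TLSVersion.TLSv1_2  # refuse legacy protocol versions
    return ctx

def fetch_over_tls(host, port=443):
    """Open a TCP socket, wrap it in TLS, then speak HTTP through the tunnel."""
    ctx = make_tls_context()
    with socket.create_connection((host, port)) as raw_sock:
        # This is where TLS slots in between transport (TCP) and application (HTTP)
        with ctx.wrap_socket(raw_sock, server_hostname=host) as tls_sock:
            tls_sock.sendall(b"HEAD / HTTP/1.0\r\nHost: " + host.encode() + b"\r\n\r\n")
            return tls_sock.recv(1024)
```

The certificate verification and hostname check done by `wrap_socket` are exactly the pieces that defend against the active MITM attacks described above.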

The protocol in itself is very interesting to learn. There are a bunch of attacks on SSL/TLS, and it is also interesting to know about the defense mechanisms and the evolution of TLS from 1996 until 2016: 20 years of growth. You can check the presentation slides for more details.



Every application that needs to identify its users requires a security mechanism to keep track of logins and perform access control. The web is not a safe place, and an authentication mechanism is a necessity for every application that you might want to build or use.

There are mainly three kinds of authentication.

  • Knowledge based: something you know. E.g. a password, a TAN sent to you via phone, a passphrase, etc.
  • Ownership based: something you have. E.g. smart cards, YubiKeys, certificates.
  • Biometric: something you are. E.g. fingerprint, iris.

In a system that requires strong security, it is highly recommended that you implement means from at least two of the above categories. This is generally called multi-factor authentication. On the other hand, there are 2-step verification and similar approaches that combine two authentication mechanisms falling in the same category. Google 2-step verification is an example of this.
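The one-time codes used by Google 2-step verification (and most authenticator apps) are generated with TOTP (RFC 6238), which is just HOTP (RFC 4226) keyed on the current 30-second time window. A minimal standard-library sketch:

```python
import hmac
import hashlib
import struct
import time

def hotp(secret, counter, digits=6):
    """RFC 4226 HOTP: HMAC-SHA1 over the 8-byte counter, then dynamic truncation."""
    digest = hmac.new(secret, struct.pack(">Q", counter), hashlib.sha1).digest()
    offset = digest[-1] & 0x0F  # low nibble of last byte picks the window
    code = struct.unpack(">I", digest[offset:offset + 4])[0] & 0x7FFFFFFF
    return str(code % 10 ** digits).zfill(digits)

def totp(secret, for_time=None, step=30):
    """RFC 6238 TOTP: HOTP applied to the current time divided into 30 s steps."""
    if for_time is None:
        for_time = int(time.time())
    return hotp(secret, for_time // step)
```

With the RFC test secret `b"12345678901234567890"`, `hotp(secret, 1)` yields `"287082"`, matching the published RFC 4226 test vector, so the truncation logic can be checked against the standard.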

Web applications make use of various authentication protocols, mainly basic access authentication, digest access authentication, form-based authentication and OAuth. You can read about the protocols in detail here.


I recently got an opportunity to present about Heartbleed at my new workplace, ERNW. I took some time to do a detailed study of the vulnerability. I am quite amazed by its simplicity when compared to its huge impact. I know there are plenty of posts about Heartbleed; it is one of the most popular attacks of recent times. Here is a brief summary of the key points.

Heartbleed is an implementation vulnerability in the Heartbeat subprotocol, an extension that adds “keep-alive” functionality to TLS. This functionality checks whether the other party in the communication is still available. It is mainly aimed at DTLS, which works on top of protocols like UDP that, unlike TCP, have no flow management. An attacker can leak up to 64 KB of data with one single heartbeat request. Also, as the heartbeat request is considered genuine, it leaves no traces in any logs. The heartbeat extension can be advertised by both the client and the server, which means the vulnerability can be used to attack either side.

What exactly is the vulnerability?

The Heartbeat extension mentioned above allows requests with a variable payload size, and the request itself declares the length of that payload. An attacker can therefore claim a payload length much larger than the payload actually sent. The vulnerable server trusts the declared length and copies that many bytes into its response, running past the end of the real payload and leaking whatever sensitive content happens to sit in adjacent memory.
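The missing bounds check can be sketched as a toy model. This is not OpenSSL’s actual code: the byte buffer standing in for heap memory and the “secret” are purely illustrative, but the trust in the attacker-supplied length is exactly the bug.

```python
# A toy model of the Heartbleed over-read. The bytearray stands in for the
# server's heap; the secret sits right after the received heartbeat payload.
SECRET = b"user:admin;pass:hunter2;"

def heartbeat_response(payload, claimed_len, patched):
    memory = bytearray(payload) + bytearray(SECRET)  # adjacent heap contents
    if patched and claimed_len > len(payload):
        return b""  # the fix: silently discard requests whose length field lies
    # Vulnerable path: trust the attacker-supplied length field and echo back
    # that many bytes, reading past the end of the real payload.
    return bytes(memory[:claimed_len])
```

Sending a 4-byte payload while claiming a much larger length makes the unpatched handler return the payload plus the adjacent secret; the patched handler drops the request instead.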

Why is Heartbleed an implementation error?

The Heartbleed vulnerability is found only in OpenSSL versions 1.0.1 through 1.0.1f; the rest of the versions are secure. As per the TLS RFC, a Heartbeat request should be sent only after the client and the server complete the handshake. Unfortunately, the vulnerable versions of OpenSSL allowed a Heartbeat request right after the Client and Server Hello. This means that even before encryption begins, a rogue client or server can send a malicious heartbeat request that results in a sensitive data leak.

How can we fix the issue?

Upgrade OpenSSL to version 1.0.1g or above. In case of compromise, make sure that the certificates are revoked and new keys are issued, because there is a good chance that your private key was leaked. Similarly, user credentials should be changed to ensure that an attacker cannot reuse anything stolen. You could also disable the heartbeat extension as a temporary fix.

If you would like to read more, please find the PDF: Heartbleed.

Getting started with network security monitoring

Network Security Monitoring (NSM) can be broadly classified into three areas: collection, detection and analysis [11]. There are plenty of tools designed exclusively for each area. Tools like tcpdump [14], Wireshark [13], tshark [16], DNSstats [15], Bro [17] and Chaosreader [18] come under collection tools. Their main job is to collect data from an interface and save it as a dump; they are also called sensor tools. The next category is detection, which consists of two approaches: signature based and behaviour based. These tools detect malicious behaviours or signatures in pcaps. Snort [19] and Suricata [20] are the main signature based detection tools available. They work mainly on signatures developed by analysing malware samples. The main inefficiency of this approach is that only ‘known’ malware is detected; zero days and new versions of previous malware can easily evade such detection. Behaviour based detection is discussed in detail in the following sections. The third category, analysis, comes into play after an incident (security breach) is reported. These tools allow deeper packet analysis and wide inspection of packet data, with features such as marking and easy traversal. Sguil [21], Squert [22], Wireshark and ELSA [23] are examples in this category. In total, an NSM tool should contain a collector, a detector and an analysis interface. The research problems associated with each category are different; my analysis was mainly on collection and detection.
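At its very simplest, signature based detection of the Snort/Suricata kind boils down to matching known byte patterns against payloads. The sketch below illustrates only that core idea (real rules also match on ports, flow state, offsets, etc.), and the signatures themselves are made up for illustration:

```python
# Minimal sketch of signature-based detection: scan a payload for known byte
# patterns, the simplest form of a Snort-style "content" rule. The signature
# names and patterns below are invented for this example.
SIGNATURES = {
    "fake-exploit-nopsled": b"\x90\x90\x90\x90\xcc",
    "fake-c2-beacon": b"GET /gate.php?id=",
}

def match_signatures(payload):
    """Return the names of all signatures whose pattern occurs in the payload."""
    return [name for name, pattern in SIGNATURES.items() if pattern in payload]
```

The weakness discussed above is visible directly: any malware whose traffic does not contain one of these exact byte strings sails through untouched.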


The two main questions to ask here are: 1) what data needs to be collected (indexing, filters used), and 2) what line rate is supported. Some of the important libraries in this area are libpcap [24] (WinPcap on Windows), used in tcpdump and tcpdump-netmap, n2disk10g [25] from ntop, and FloSIS [2]. FloSIS [2] and n2disk10g [25] follow a different approach compared to libpcap: they are designed to keep up with multi-gigabit speeds (10 Gbps) on commodity hardware. Their design makes efficient use of the parallelism capabilities of CPUs, memory and NICs, and they maintain two-stage indexes (bloom filters) in real time to speed up query responses. Libpcap, the most widely used library, is not efficient on high speed networks (it works well at rates of 400-500 Mbps), but it is the best library available for free and ready to use. N2disk10g [25] requires a license, and FloSIS is not open source and is not available.
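What these collection tools ultimately write is the libpcap capture format, which opens with a 24-byte global header. The sketch below builds such a header in memory (as `tcpdump -w` would at the start of a file) and parses it back; it covers only the global header, not per-packet records.

```python
import struct

# The classic libpcap file format begins with a 24-byte global header:
# magic, version major/minor, timezone offset, timestamp accuracy,
# snapshot length, and link-layer type.
PCAP_MAGIC = 0xA1B2C3D4  # microsecond-resolution timestamps

def build_global_header(snaplen=65535, linktype=1):  # linktype 1 = Ethernet
    return struct.pack("<IHHiIII", PCAP_MAGIC, 2, 4, 0, 0, snaplen, linktype)

def parse_global_header(data):
    magic, major, minor, tz, sigfigs, snaplen, linktype = struct.unpack(
        "<IHHiIII", data[:24])
    return {"magic": magic, "version": (major, minor),
            "snaplen": snaplen, "linktype": linktype}
```

The `snaplen` field is one answer to question 1 above: it caps how many bytes of each packet the sensor keeps, trading completeness of collection against storage and line rate.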


There is plenty of research based on statistical data and session data. Below I discuss only those areas that are relevant to HTTP protocol analysis and the corresponding detection mechanisms. This concerns HTTP header and payload analysis without deep packet inspection.

There are correlation techniques where the length of the URL, the request type (GET/POST), the average amount of data in the payload, hosts that rely on the same web-based application (which could be the C2 server), etc. are used for clustering HTTP-based malware. TAMD [5] (traffic aggregation for malware detection) describes a technique of hashing blocks of the payload and performing matches to identify correlations, as well as capturing syntactic similarities in the payload; however, this is applied to spam emails and not to application data bytes. PAYL [6] describes a payload inspection technique based on profiling normal payloads. It calculates the deviation of a test payload from the normal profile using a simplified Mahalanobis distance; a payload is considered anomalous if this distance exceeds a predetermined threshold. A very similar approach is SLADE (Statistical PayLoad Anomaly Detection Engine), as used in BotHunter [7]. It uses n-byte sequence information from the payload, which helps construct a more precise model of normal traffic than the single-byte model used in PAYL [6]. In another approach, the payload of each HTTP request is segmented into a certain number of contiguous blocks, which are subsequently quantized according to a previously trained scalar codebook [3].
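The single-byte PAYL idea can be sketched in a few lines: build a profile of the mean and standard deviation of each byte value's frequency over normal payloads, then score a test payload with the simplified Mahalanobis distance sum |x - mean| / (std + alpha). This is a stripped-down illustration of the technique (no per-length models, no clustering), with all function names my own.

```python
import math

def byte_freq(payload):
    """Relative frequency of each of the 256 byte values in a payload."""
    counts = [0] * 256
    for b in payload:
        counts[b] += 1
    total = len(payload) or 1
    return [c / total for c in counts]

def train_profile(payloads):
    """Mean and standard deviation of each byte frequency over normal traffic."""
    freqs = [byte_freq(p) for p in payloads]
    n = len(freqs)
    mean = [sum(f[i] for f in freqs) / n for i in range(256)]
    std = [math.sqrt(sum((f[i] - mean[i]) ** 2 for f in freqs) / n)
           for i in range(256)]
    return mean, std

def distance(payload, profile, alpha=0.001):
    """Simplified Mahalanobis distance; alpha smooths zero-variance bytes."""
    mean, std = profile
    freq = byte_freq(payload)
    return sum(abs(freq[i] - mean[i]) / (std[i] + alpha) for i in range(256))
```

Trained on a handful of ordinary GET requests, a shellcode-like payload full of 0x90/0xCC bytes scores far above any HTTP-looking request; flagging then reduces to comparing the distance against a threshold learned from the training data.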

I hope you enjoyed the read. More on network security monitoring later.


[1] Behavioral Clustering of HTTP-Based Malware and Signature Generation Using Malicious Network Traces

[2] FloSIS: A Highly Scalable Network Flow Capture System for Fast Retrieval and Storage Efficiency

[3] Measuring normality in HTTP traffic for anomaly-based intrusion detection


[5] Classifying HTTP Traffic in the New Age

[6] K. Wang and S. Stolfo. Anomalous payload-based network intrusion detection. In Proceedings of the International Symposium on Recent Advances in Intrusion Detection (RAID), 2004.

[7] BotHunter: Detecting Malware Infection Through IDS-Driven Dialog Correlation

[8] Know your digital enemy

[9] Comparison on dump tools:


[11] Sanders, Chris and Smith, Jason, Applied Network Security Monitoring: Collection, Detection, and Analysis.

[12] Bejtlich, Richard, The Practice of Network Security Monitoring.

[13] Wireshark:

[14] Tcpdump:

[15] Dnstats:

[16] Tshark:

[17] Bro:

[18] Chaosreader:

[19] Snort:

[20] Suricata:

[21] Sguil:

[22] Squert:

[23] ELSA:

[24] Libpcap:

[25] n2disk:

[26] TRE:

[27] A. K. Jain, M. N. Murty, and P. J. Flynn. Data clustering: a review. ACM Comput. Surv., 31(3):264–323, 1999

Study of GhostNet

I wrote a report about GhostNet for one of my coursework assignments. GhostNet is an advanced persistent threat originating from China. I thought it might be an interesting read for those who would like to learn about advanced persistent threats in general and about GhostNet as a case study.

Abstract: The report discusses the history, motivation, operation and detection of an advanced persistent threat (APT) called GhostNet. GhostNet originates from China, with Tibet as one of its main targets. The history and political background of the conflict between Tibet and China over more than 50 years is a strong motivation for the emergence of the threat. Even though GhostNet began with Tibetan organisations among its main targets, the attack spread to about 103 countries, infecting machines in governmental organisations and NGOs. The attacks make use of the typical phishing email technique: the email contains contextually relevant information that makes it easy for an unsuspecting user to fall into the trap. The attack also includes the use of a command and control server. The report concludes with relevant detection techniques and countermeasures for protecting a system against GhostNet and similar advanced persistent threats.

PDF: GhostNet.

Enjoy reading!:)

The need for cyber awareness

During my summer vacation last year, I joined the Ayudh youth camp in Brombachtal, a small village near Frankfurt, Germany. It was a summer camp for young people, with workshops in areas ranging from sports, music and arts to science. I was very excited to be part of the anti-cyber-bullying workshop. It is very important to be aware of various cyber threats in order to stay safe online and maintain privacy. This is even more important for children, who are connected to the internet through lots of gadgets all the time without being aware of the potential dangers.
As part of the camp, we went to the Theodor Litt Schule, a middle school in the nearby town of Michelstadt. I joined the four-day workshop for the school kids, conducted by youngsters like me (in the 20-30 age group). I must say that it was a great experience to be part of the Ayudh group. Lucia Rijker, a former world champion in boxing, was also part of the workshop. The press calls Lucia “The Most Dangerous Woman in the World”, and it was an awesome experience to spend time with her. She conducted sessions on respect, stressing the need for self-respect as well as respect for others. Some of the discussions and informal sessions were invaluable. Students opened up and shared private incidents from their lives. I got to know that incidents happening in cyberspace leave deep imprints on the minds of children. Online identity is given more importance than necessary, which has a huge impact on the self-confidence of a child.

There were multiple incidents where children used cyberspace as a platform for bullying a friend or classmate with abusive public posts, pictures or videos. There are even instances where victims ended up taking their own lives, unable to bear the shame after their videos or posts went viral. Before we share or like a funny video of someone (even a stranger), we must think twice about what we actually gain from it. If the content makes fun of someone, with a single click we decide whether or not to participate in harassing them. It is very important to realise this sense of responsibility for every action we take online. The workshop as a whole was an enriching experience, awakening respect for myself, my friends and others. We even staged a small drama to spread awareness against cyber bullying.


After the camp, I went to India, my home country, where I got to know about an incident near my village: a 9th-grade child committed suicide due to a fake relationship he had made online. I felt very sad about it. This tragic incident shows that there is an urgent need to make children aware of their activities online. I was very happy when I got a chance to go to the Amrita Vidyalayam school in Kollam, Kerala, and talk about the relevance of cyber security and privacy to about 150 school kids. In both schools, in Frankfurt and in Kollam, at different ends of the world, I noticed a growing concern among teachers and parents about their kids' safety online. It is alarming to hear about more and more incidents of cyber bullying and cheating happening around the world. I will make use of every opportunity I get to spread awareness regarding this issue; it is a collective responsibility for everyone to be aware of the pitfalls of cyberspace. Being a student of computer security, I find the relevance of safety and security online a strong motivation to continue in my field. Computer security is a lot of fun as far as the technical studies are concerned, but studying it gives me immense satisfaction mainly because it also serves a strong social cause.

Why kids should learn computer science

I dedicate this post to all the high school kids out there who would like to know more about the science of computing.

Computer science has transformed the way we live by influencing all major domains. What was science fiction yesterday is science fact today, thanks to computer science. The boons of computing are all around us, from automatic translation of documents, to driverless cars, to smartphones that understand our speech and life patterns. Facebook, WhatsApp and Twitter have become an integral part of our social lives. Travelling and exploring new places took on a new dimension with Google Maps in everyone's pocket. Even the health sector is deeply influenced by new health apps. Education is yet another field: you can take courses offered by professors from world-class universities like Stanford, MIT or Cambridge while sitting at home. Think about movies. Do you remember how amazed you were to see Harry Potter with his magic wand, or Kung Fu Panda showing his cool kung fu moves? How about seeing a drone flying in the sky, bringing the pizza you ordered to your doorstep?

We live in a new digital era, and we all benefit from computer science in one way or another. There is a huge variety of areas one could specialise in after studying computer science. You could be a designer, playing around with layouts, colours and user interactions. You could be a security engineer, finding out how to attack a system and coming up with strong defense techniques. You could be a game developer, making all those cool games that you play on your tablet. You could be a software engineer interested in making algorithms for running robots, connecting large networks, managing data faster, and so on. Maybe you could be the one who finally builds the world's first intelligent machine close to the human brain, or even superior! These are just a few options out of hundreds of other cool things you could do with computer science!

When I was fourteen, I had deep interests in the arts as well as the sciences. I loved music, painting, logic and programming. When the time came to apply to college, I was confused about what to study. I believe I made the right choice in choosing computer science. Learning computer science lets you look at the world without boundaries between arts and science. The attitude of a maker and an artist is exactly what this field needs. We cannot easily predict the jobs that will be needed in the next decade, so we need to think about what kind of mindset would prepare us for an exponentially changing world. In the future, most jobs that involve repetition will disappear as traditional jobs get redefined.

As we can see, the world is changing at a rapid pace under the influence of technology and the huge flow of information. We need designers, artists, architects, civil servants and teachers who all have a grasp of computing principles and data analysis. Knowing computer science is not just a career option; it is a need, and an added advantage to your skill set in any field you choose to go into in the future. It expands your thinking and helps you become a problem solver who is dynamic enough to adapt to a changing world, ready to learn new skills and collaborate with others.

Note: Some thoughts are inspired from a discussion with my friend @karmakomik.