
Friday, 23 February 2018

Traffic Pattern-Based Content Leakage Detection for Trusted Content Delivery Networks (2014)


Due to the increasing popularity of multimedia streaming applications and services in recent years, trusted video delivery to prevent undesirable content leakage has become critical. While preserving user privacy, conventional systems address this issue with methods based on observing the streamed traffic throughout the network. These systems maintain high detection accuracy while coping with some of the traffic variation in the network (e.g., network delay and packet loss); however, their detection performance degrades substantially under significant variation in video length. In this paper, we focus on overcoming this issue by proposing a novel content-leakage detection scheme that is robust to variation in video length. By comparing videos of different lengths, we determine a relation between the lengths of the compared videos and the similarity between them, and thereby enhance the detection performance of the proposed scheme even in an environment subject to variation in video length. Through a testbed experiment, the effectiveness of our proposed scheme is evaluated in terms of variation of video length, delay variation, and packet loss.
A crucial concern in video streaming services is the protection of the bit stream from unauthorized use, duplication, and distribution. One of the most popular approaches to prevent undesirable content distribution to unauthorized users and/or to protect authors' copyrights is digital rights management (DRM) technology. Most DRM techniques employ cryptographic or digital watermarking methods. However, such approaches have no significant effect on the redistribution of content that has been decrypted or restored at the user side by authorized yet malicious users.
Moreover, redistribution is technically no longer difficult thanks to peer-to-peer (P2P) streaming software; hence, streaming traffic may be leaked to P2P networks.
In this paper, we focus on the illegal redistribution of streaming content by an authorized user to external networks. The existing proposals monitor information obtained at different nodes in the middle of the streaming path. The retrieved information is used to generate traffic patterns that appear as a unique waveform per content, just like a fingerprint.
- These technologies enhance the distribution of any type of information over the Internet.
- The traffic pattern generation process is performed as in conventional methods.
1.    Video Leakage Setting
2.    Leakage Detection Measures
3.    Pattern Generation
4.    Pattern Matching
5.    Leakage Detection Criterion
Video Leakage Setting:
Due to the popularity of streaming delivery of movies, the development of P2P streaming software has attracted much attention. These technologies enhance the distribution of any type of information over the Internet. First, a regular user in a secure network receives streaming content from a content server. Then, using P2P streaming software, the regular yet malicious user redistributes the streaming content to a non-regular user outside the network. Such content leakage is hardly detected or blocked by watermarking- and DRM-based techniques.
Leakage Detection Measures:
Throughout the video streaming process, changes in the amount of traffic appear as a unique waveform specific to the content. Thus, by monitoring this information retrieved at different nodes in the network, content leakage can be detected. The topology consists of two main components: the traffic pattern generation engine embedded in each router, and the traffic pattern matching engine implemented in the management server. Each router observes its traffic volume and generates a traffic pattern, while the traffic pattern matching engine computes the similarity between traffic patterns through a matching process and, based on a specific criterion, detects content leakage. The result is then sent to the target edge router to block the leaked traffic.
Pattern Generation:
We describe the traffic pattern generation process performed in conventional methods. The process is based on either a time slot-based algorithm or a packet size-based algorithm.
The time slot-based algorithm is a straightforward way to generate traffic patterns by summing the amount of traffic arriving during a certain period of time, t. If some packets are delayed, they may be counted in the following slot, xi+1, instead of the intended slot, xi. Therefore, packet delay and jitter distort the traffic pattern and, as a consequence, decrease pattern matching accuracy. Moreover, the time slot-based algorithm is also affected by packet loss.
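As a minimal sketch (not the paper's exact implementation), the time slot-based generation described above can be expressed as summing packet sizes per fixed-width time slot; the class and method names here are hypothetical:

```java
// Hypothetical sketch of time slot-based traffic pattern generation:
// the traffic volume arriving in each fixed-width slot of t ms is summed.
public class TimeSlotPattern {
    // arrivalMs: packet arrival times (ms); sizeBytes: packet sizes; slotMs: slot width t
    public static long[] generate(long[] arrivalMs, int[] sizeBytes, long slotMs) {
        long maxT = 0;
        for (long t : arrivalMs) maxT = Math.max(maxT, t);
        long[] slots = new long[(int) (maxT / slotMs) + 1];
        for (int i = 0; i < arrivalMs.length; i++) {
            // A delayed packet falls into slot x_{i+1} instead of x_i,
            // which is why delay and jitter distort this pattern.
            slots[(int) (arrivalMs[i] / slotMs)] += sizeBytes[i];
        }
        return slots;
    }
}
```

A lost packet simply never contributes to its slot, which illustrates the sensitivity to packet loss noted above.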
The packet size-based algorithm defines a slot as the sum of arriving traffic until a packet of a certain size is observed. Since this algorithm relies only on packet arrival order and packet size, it is robust to environmental changes such as delay and jitter. However, it shows no robustness to packet loss.
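The packet size-based rule can be sketched as follows, assuming (as an illustration, not the paper's exact rule) that a slot closes whenever a packet of at least a chosen marker size arrives; the names are hypothetical:

```java
import java.util.ArrayList;
import java.util.List;

// Hypothetical sketch of packet size-based traffic pattern generation:
// traffic is accumulated until a packet of at least markerBytes is seen,
// which closes the current slot. Arrival times are never used, so the
// pattern is unaffected by delay and jitter.
public class PacketSizePattern {
    public static List<Long> generate(int[] sizeBytes, int markerBytes) {
        List<Long> slots = new ArrayList<>();
        long acc = 0;
        for (int s : sizeBytes) {
            acc += s;
            if (s >= markerBytes) { // marker packet closes the slot
                slots.add(acc);
                acc = 0;
            }
        }
        if (acc > 0) slots.add(acc); // trailing partial slot
        return slots;
    }
}
```

If a marker packet is lost, two slots silently merge, which is why this algorithm is not robust to packet loss.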
Pattern Matching:
In pattern recognition, the degree of similarity is defined as a similarity measure between patterns. The server-side traffic patterns represent the original traffic pattern. The fundamental method to quantify the similarity of traffic patterns, called the cross-correlation matching algorithm, consists of computing the cross-correlation coefficient, which is used as a metric of similarity between the traffic patterns. The similarity is computed between the partial user-side pattern XU and the server-side pattern YU.
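A standard zero-lag cross-correlation (Pearson) coefficient between two equal-length patterns can serve as this similarity metric; the sketch below assumes that form, and the class name is hypothetical:

```java
// Cross-correlation (Pearson) coefficient between two equal-length patterns.
// Values near 1 indicate similar waveforms; values near 0 indicate
// uncorrelated (random) waveforms.
public class CrossCorrelation {
    public static double coefficient(double[] x, double[] y) {
        int n = x.length;
        double mx = 0, my = 0;
        for (int i = 0; i < n; i++) { mx += x[i]; my += y[i]; }
        mx /= n; my /= n;
        double num = 0, dx = 0, dy = 0;
        for (int i = 0; i < n; i++) {
            num += (x[i] - mx) * (y[i] - my);
            dx += (x[i] - mx) * (x[i] - mx);
            dy += (y[i] - my) * (y[i] - my);
        }
        return num / Math.sqrt(dx * dy);
    }
}
```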
Another pattern matching algorithm is dynamic programming (DP) matching, based on the DP technique. DP matching uses the distance between the compared patterns in U-dimensional vector space as the metric representing their similarity.
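As an illustration of the DP idea (the paper's exact cost function may differ), a standard dynamic-time-warping-style recurrence with absolute-difference cost gives a distance that is small for similar patterns:

```java
// Hypothetical DP (DTW-style) matching sketch: d[i][j] is the minimal
// accumulated cost of aligning the first i elements of x with the first
// j elements of y. A smaller final distance means more similar patterns.
public class DpMatching {
    public static double distance(double[] x, double[] y) {
        int n = x.length, m = y.length;
        double[][] d = new double[n + 1][m + 1];
        for (double[] row : d) java.util.Arrays.fill(row, Double.POSITIVE_INFINITY);
        d[0][0] = 0;
        for (int i = 1; i <= n; i++) {
            for (int j = 1; j <= m; j++) {
                double cost = Math.abs(x[i - 1] - y[j - 1]);
                // allow match, insertion, or deletion of a slot
                d[i][j] = cost + Math.min(d[i - 1][j - 1],
                                 Math.min(d[i - 1][j], d[i][j - 1]));
            }
        }
        return d[n][m];
    }
}
```

Because the alignment may skip or repeat slots, this style of matching tolerates patterns of slightly different lengths, unlike the fixed-index cross-correlation above.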
Leakage Detection Criterion:
The cross-correlation matching algorithm is performed both on traffic patterns generated through the time slot-based algorithm and on those generated through the packet size-based algorithm. The similarity values obtained from matching time slot-based traffic patterns are considerably small, and their distribution is assumed to be normal around zero, since the distribution of cross-correlation coefficients of two random waveforms approximates a normal distribution. The DP matching algorithm, on the other hand, is performed on traffic patterns generated through the packet size-based algorithm, and a fixed predefined value is used as the decision threshold. Whether or not patterns are similar is decided by comparing the distance computed through DP matching with the decision threshold; a distance less than the threshold indicates that the compared traffic patterns are similar.
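The two decision rules above can be sketched as simple threshold checks; the mean/standard-deviation form for the correlation rule and the parameter k are assumptions for illustration, not values from the paper:

```java
// Hypothetical decision-criterion sketch for the two matching schemes.
public class DetectionCriterion {
    // Cross-correlation coefficients of unrelated patterns are roughly
    // normal around zero, so a coefficient far above that distribution
    // (mean + k standard deviations, k assumed) suggests leakage.
    public static boolean correlationIndicatesLeak(double coeff,
                                                   double mean, double std, double k) {
        return coeff > mean + k * std;
    }

    // DP matching uses a fixed predefined threshold: a distance below
    // the threshold means the patterns are similar, i.e., leakage.
    public static boolean dpIndicatesLeak(double distance, double threshold) {
        return distance < threshold;
    }
}
```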

- System: Pentium IV, 2.4 GHz
- Hard Disk: 40 GB
- Floppy Drive: 1.44 MB
- Monitor: 15" VGA Colour
- Mouse: Logitech
- RAM: 512 MB
- Operating System: Windows XP/7
- Coding Language: Java
- IDE: NetBeans 7.4
- Database: MySQL
Hiroki Nishiyama, Desmond Fomo, Zubair Md. Fadlullah, and Nei Kato, "Traffic Pattern-Based Content Leakage Detection for Trusted Content Delivery Networks," IEEE Transactions on Parallel and Distributed Systems, vol. 25, no. 2, February 2014.
