Cookies file not working

Can anyone help me? I've tried numerous times to get this command to work. I've used numerous cookie extensions and made sure I'm using the Netscape cookie format, but no matter what I do I keep getting the same error:

C:\WINDOWS\system32>youtube-dl --cookies cookies.txt -f best AND NAME OF VIDEO

[DiscoveryPlus] VIDEO NAME: Downloading JSON metadata
ERROR: This video is only available for registered users. You may want to use --cookies.

I have tried a couple of cookie files, and here are a couple of the headers:

# Netscape HTTP Cookie File
# This file was generated by EditThisCookie


# Netscape HTTP Cookie File
# This is a generated file!  Do not edit.

Edit: Solved. The cookies file wasn't being picked up from the PATH folder where I had youtube-dl. I tried it in its own directory and it worked first try.
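For anyone hitting the same wall: the Netscape format youtube-dl expects is one cookie per line with seven tab-separated fields (domain, include-subdomains flag, path, secure flag, expiry, name, value). A quick sanity check you can run over a cookies.txt, sketched in Python (the sample line is made up):

```python
def check_netscape_cookies(text):
    """Return (name, value) pairs from a Netscape-format cookies.txt,
    raising on lines that do not have the seven tab-separated fields."""
    cookies = []
    for lineno, line in enumerate(text.splitlines(), 1):
        # Comments and blank lines are allowed; '#HttpOnly_' lines are real cookies.
        if not line.strip() or (line.startswith("#") and not line.startswith("#HttpOnly_")):
            continue
        fields = line.split("\t")
        if len(fields) != 7:
            raise ValueError(f"line {lineno}: expected 7 fields, got {len(fields)}")
        domain, subdomains, path, secure, expiry, name, value = fields
        cookies.append((name, value))
    return cookies

sample = "# Netscape HTTP Cookie File\n.example.com\tTRUE\t/\tTRUE\t1700000000\tsession\tabc123\n"
print(check_netscape_cookies(sample))  # [('session', 'abc123')]
```

If this raises on your exported file, the extension produced something that isn't really Netscape format (a common cause of the "--cookies" error above).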

👍︎ 5
📰︎ r/youtubedl
👤︎ u/greatryry
📅︎ Feb 22 2021
Ruby Fortune 100 gratis spins and $1000 free bonus (register)

Ruby Fortune Gratis Spins and Free Bonuses

Join Ruby Fortune Casino and play 100 gratis spins after sign-up. Next, get a 100% welcome bonus on your first 3 deposits. All in all, you are entitled to 100 free spins on Sugar Parade and $750 free credits to play any game.

>> Get Your Free Spins Bonus <<

Ruby Fortune Casino Online Review

Fair. Easy. Safe. Fun. These four words were printed along the top of the Ruby Fortune Casino when we made our first virtual trip to the site. Whether these four words were accurate descriptors or not was something we planned to find out. Going into our review, all the information we had was that the Ruby Fortune Casino was part of the Palace Casino Group that had been around since the early 2000’s.

Initially, we expected a carbon copy of the other Palace Group sites we had reviewed, but to our surprise (and a pleasant surprise at that) the Ruby Fortune looked to be significantly different than the other Palace Casino Group sites. Yes, we’re talking about the look and feel but also at first glance it looked like the game selection, and a few of the more integral parts of the site were also different (and much better).

The biggest difference that jumped out to us was the number of games to choose from and the quality of the games. In other previous Palace Casino Group reviews, the game selection was good, not great, and the quality of the games was mediocre with no branded games. Immediately, we saw Game of Thrones and Jurassic Park and a ton more branded games from Microgaming jumping off the page.

Honestly, it looks like Ruby Fortune may be the bread winner out of the Palace Casino Group websites where they spend all their money and have the best games and amenities. To be sure of this, we, of course, needed to dig into the site and give it a full review. As always with our reviews, we’re not here to write a positive puff piece that aims to make you feel good. We’re here to give you the down and dirty on the site whether that’s a lot of good, a lot of bad, or even a lot of ugly.

If you’ve ever read any of our other reviews, you know that we don’t hold back. If we see something we don’t like or want to see improved, or we’re even just having a bad day, we will let a site have it. W

... keep reading on reddit ➡

👍︎ 2
📰︎ r/u_freespinsbonus
👤︎ u/freespinsbonus
📅︎ Oct 29 2020
Ping failing between ER-4 and SRX320 despite correct OSPF routes showing in RIB

I have a multi-area OSPF setup between an ER-4 and an SRX320, and inter-area routes are showing up, but I can't ping any of those inter-area subnets from the ER-4. I can ping from the SRX subnets to the ER-4, but pinging the SRX subnets from behind the ER-4 fails. OSPF seems to be set up properly, the interface connections all seem to be fine, and the correct routes are being added to the route table.

ER-4 Config:

firewall {
    all-ping enable
    broadcast-ping disable
    ipv6-name WANv6_IN {
        default-action drop
        description "WAN inbound traffic forwarded to LAN"
        rule 10 {
            action accept
            description "Allow established/related sessions"
            state {
                established enable
                related enable
            }
        }
        rule 20 {
            action drop
            description "Drop invalid state"
            state {
                invalid enable
            }
        }
    }
    ipv6-name WANv6_LOCAL {
        default-action drop
        description "WAN inbound traffic to the router"
        rule 10 {
            action accept
            description "Allow established/related sessions"
            state {
                established enable
                related enable
            }
        }
        rule 20 {
            action drop
            description "Drop invalid state"
            state {
                invalid enable
            }
        }
        rule 30 {
            action accept
            description "Allow IPv6 icmp"
            protocol ipv6-icmp
        }
        rule 40 {
            action accept
            description "allow dhcpv6"
            destination {
                port 546
            }
            protocol udp
            source {
                port 547
            }
        }
    }
    ipv6-name lanv6 {
        default-action accept
    }
    ipv6-name localv6 {
        default-action accept
    }
    ipv6-name wanv6_lan {
        default-action drop
        rule 10 {
            action accept
... keep reading on reddit ➡

👍︎ 9
📰︎ r/networking
👤︎ u/bagostini
📅︎ Mar 27 2020
API server no longer responds with the header 'www-authenticate: Bearer realm="reddit", error="invalid_token"' when an API client makes a request with an expired or invalid auth token.

For API requests made with an expired auth token, the reddit API server no longer provides the "www-authenticate" response header (with error="invalid_token").

This change is suddenly breaking my API client, which has been running for years.

Here's an example (note the intentional invalid token):

$ curl -i -H "Authorization: bearer an_invalid_token" -A "API Client by [email protected]" ""

HTTP/2 401 
content-type: application/json; charset=UTF-8
x-ua-compatible: IE=edge
x-frame-options: SAMEORIGIN
x-content-type-options: nosniff
x-xss-protection: 1; mode=block
access-control-allow-origin: *
access-control-expose-headers: X-Moose
cache-control: max-age=0, must-revalidate
x-moose: majestic
accept-ranges: bytes
date: Mon, 06 Apr 2020 23:59:57 GMT
via: 1.1 varnish
x-served-by: cache-ams21063-AMS
x-cache: MISS
x-cache-hits: 0
x-timer: S1586217598.786444,VS0,VE131
set-cookie: loid=[redacted];; Max-Age=63071999; Path=/; expires=Wed, 06-Apr-2022 23:59:57 GMT; secure; SameSite=None; Secure
set-cookie: session_tracker=[redacted];; Max-Age=7199; Path=/; expires=Tue, 07-Apr-2020 01:59:57 GMT; secure; SameSite=None; Secure
set-cookie: csv=1; Max-Age=63072000;; Path=/; Secure; SameSite=None
set-cookie: edgebucket=GLRllpOJdEbc5YAHoB;; Max-Age=63071999; Path=/;  secure
strict-transport-security: max-age=15552000; includeSubDomains; preload
server: snooserv
content-length: 41

{"message": "Unauthorized", "error": 401}

The expected response header—'www-authenticate'—is missing:

'www-authenticate: Bearer realm="reddit", error="invalid_token"'
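A client can guard against this by parsing the challenge when it is present and falling back to the bare 401 otherwise. A minimal sketch in Python (the header string is the one from this report; the function name is mine):

```python
import re

def parse_bearer_challenge(header):
    """Extract auth-params from a WWW-Authenticate Bearer challenge,
    returning {} when the header is absent or not a Bearer challenge."""
    if not header or not header.lower().startswith("bearer"):
        return {}
    return dict(re.findall(r'(\w+)="([^"]*)"', header))

print(parse_bearer_challenge('Bearer realm="reddit", error="invalid_token"'))
# {'realm': 'reddit', 'error': 'invalid_token'}
print(parse_bearer_challenge(None))  # {}
```

With this shape, a missing header simply yields no `error` attribute, and the client can treat any 401 as "refresh the token" instead of crashing.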

RFC 6750 (The OAuth 2.0 Authorization Framework: Bearer Token Usage), Section 3, states:

... If the protected resource request included an access token and failed
authentication, the resource server SHOULD include the "error"
attribute to provide the client with the reason why the access
request was declined. ...

Further, the RFC gives an example of what the response should look like when a request is made with an expired access token:

... in response to a protected resource request with an authentication attempt using an expired access token:

     HTTP/1.1 401 Unauthorized
     WWW-Authenticate: B
... keep reading on reddit ➡

👍︎ 12
📰︎ r/redditdev
👤︎ u/buddydvd
📅︎ Apr 07 2020
My journey to 30 and beyond, a beta story (lots of screenies)

My journey to 30 and beyond, a beta story

TL;DR Below and huge [IMAGE GALLERY]

Greetings and salutations. This is my very lucky beta journey to 30 on my Orc Warrior, including visual aids to guide you along the way. I did a good variety of questing, grinding, professions, dungeoning, and just hanging out. I will include screenshots, anecdotes, item-links, and the overall experience; this might get a bit long winded but I figured someone might enjoy my story or images. I'll even throw in a couple of useful tips and tricks you may not be familiar with.


I was very lucky and got into the WoW Classic beta late May 23rd/early May 24th at random, a day or two after the first stress test (which I also had access to and had a blast playing in). I played Vanilla from release through to TBC (on Boulderfist) back in the day and came back for WoD very briefly and again for retail BFA. Back in Vanilla I cleared ZG, MC, Ony, BWL, and AQ20 playing mostly Arms (yep, I PvP'd a lot) and also did some main tanking (in prot spec, of course). I was ecstatic I was able to recreate my old Orc Warrior, especially since I was not in the early beta waves and had mostly given up on getting a key for Classic. A few days prior to getting into the beta, I was t-boned by a driver that ran a red light, which totaled my vehicle and injured me a bit. Karma was being kind and allowed me a unique situation to play more than I usually would be able to.

Identical to my original Orc Warrior from back in the day

A couple months ago I briefly did a little early level practicing on a pserver, so after getting my interface settings, UI, and hotkeys somewhat setup, I decided to blast my way through the earlier portions of the game. Right away I noticed how helpful and amiable the community was. For example, if you ran up on a quest mob that you needed but it was already grey and tagged by another player, they would immediately invite you to their group to get you credit for the kill. In addition, people were drive by buffing everyo

... keep reading on reddit ➡

👍︎ 208
📰︎ r/classicwow
👤︎ u/Watsonator
📅︎ Jun 03 2019
Any Node.js package that can parse a cookie header that has multiple values for the same name?

It looks like the three top cookie libraries are not able to parse a cookie header value that contains duplicate names.

How are you dealing with cookies having multiple identical names? E.g. when identical cookies are set from different domains.

Note that this behaviour is specified by RFC 2616

>Multiple message-header fields with the same field-name MAY be present in a message if and only if the entire field-value for that header field is defined as a comma-separated list [i.e., #(values)]. It MUST be possible to combine the multiple header fields into one "field-name: field-value" pair, without changing the semantics of the message, by appending each subsequent field-value to the first, each separated by a comma. The order in which header fields with the same field-name are received is therefore significant to the interpretation of the combined field value, and thus a proxy MUST NOT change the order of these field values when a message is forwarded.
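Until a library handles duplicates, splitting the header yourself is straightforward; a sketch in Python (the same few lines port directly to Node):

```python
def parse_cookie_multi(header):
    """Parse a Cookie header value, keeping every value for a repeated name."""
    cookies = {}
    for pair in header.split(";"):
        name, sep, value = pair.strip().partition("=")
        if sep:  # skip malformed pairs with no '='
            cookies.setdefault(name, []).append(value)
    return cookies

print(parse_cookie_multi("a=1; b=2; a=3"))  # {'a': ['1', '3'], 'b': ['2']}
```

Mapping each name to a list sidesteps the question of which duplicate "wins" and leaves that decision to the caller.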

👍︎ 7
📰︎ r/node
👤︎ u/yvele
📅︎ Feb 25 2020
Fast HTTP package for Go 😱…
👍︎ 42
📰︎ r/golang
👤︎ u/schumacherfm
📅︎ Nov 23 2015
Linux eager to fail TCP Fast Open

I was trying to test a few servers for TCP Fast Open support. After some failed attempts, my client stopped sending Fast Open cookie requests entirely, and it wouldn't send them again until I restarted. After some looking around, I found that Linux disables client-side TCP Fast Open globally for an hour (then 2, 4, 8 hours) if it encounters 3 consecutive timeouts. I only learned this because I looked at the kernel source code:

I think this is very significant. Is this documented anywhere? I don't see it mentioned in the RFC:

at least not globally on the client side. Do people know about this? Any reference would be helpful. Thank you!
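For the record, the back-off is tunable; on kernels 4.14 and later there is a sysctl for the blackhole window (names below are to the best of my knowledge; check `sysctl -a` on your box to confirm):

```shell
# Bit 0 of tcp_fastopen enables client-side TFO, bit 1 server-side.
sysctl net.ipv4.tcp_fastopen

# Initial blackhole back-off window in seconds (doubles on repeat failures);
# setting it to 0 disables the blackhole detection entirely.
sysctl net.ipv4.tcp_fastopen_blackhole_timeout_sec
```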

👍︎ 5
📰︎ r/networking
👤︎ u/shiteater4
📅︎ Oct 15 2019
"There's only one thing to do, and that's consult the RITA."

As I said in yesterday's story, a while back I was working on a project for a large retail company. They had recently split from their parent company and were in the process of building out their own international network. I was brought in towards the tail end of the project because someone had done the math and realized that there was a good possibility they weren't going to make the cutoff date to be off the current network if something wasn't done, so I got the call.

The gig was pretty straightforward. Field crews would go into the store after closing and install the router, switch, and phones. Sometime between 8pm and 8am, they would call us and we would get everything provisioned, configured, tested, and ready for the store to open the next day. This would be my 4th time doing this exact type of thing, so it was old hat for me. I was just coming off a 6-month break from working due to medical issues, and this was going to be something relatively easy and relaxing to get me back into fighting condition.

It took me about 4 days to get back into the swing of things, but by day 5 I already had everything I needed to do scripted out. The configs were cookie-cutter, so all you had to do was change the hostname, IP addresses, SNMP info, and DHCP info. I sat around and created files that contained all this info for the 500+ stores while waiting for the field techs to call. When the call would come in, they would give me the store number and I would make the folks in proud by using cat the way God intended and spit out my config scripts based on those different files I had created earlier. I once received a UUOC award back before some of you were born, and I wasn't about to get another one.

The stores were all connected back home via MPLS, but we also had 3G failover in each of the stores should the MPLS link drop. The 3G was running over DMVPN back to the home office, and at this time my entire experience with DMVPN was in the form of 2 pages in the 2007 CCIE R&S Written Exam V3.0 study guide. (Due to a combination of alcohol induced amnesia the night of the test and the Cisco Certification and Confidentiality Agreement, I don't remember if it was actually on the test or not.)

A couple months have passed; things were going smoothly and we're now on track to finish not just on time, but actu

... keep reading on reddit ➡

👍︎ 536
👤︎ u/KiltedCajun
📅︎ Sep 07 2014
DAQ (-1) error followed by random symbols- Windows

Hello, I am hoping to find a solution to my problem. I have installed Snort 2.9.13 in a virtualized Windows Server. I have successfully tested the config file and there seems to be no error in it. I have also made some test rules to test out the functionalities of the IDS. However, whenever I try to run Snort with "-A console", it comes out with a DAQ (-1) error and random symbols following it. Does anybody have any clue as to what is happening?

This is the command I am entering

This is the error I am getting

The random symbols change every time I run Snort

and this is my conf file:

#   VRT Rule Packages Snort.conf
#   For more information visit us at:
#                   Snort Website
#    Sourcefire VRT Blog
#     Mailing list Contact:      [email protected]
#     False Positive reports:    [email protected]
#     Snort bugs:                [email protected]
#     Compatible with Snort Versions:
#     VERSIONS : 2.9.13
#     Snort build options:
#     OPTIONS : --enable-gre --enable-mpls --enable-targetbased --enable-ppm --enable-perfprofiling --enable-zlib --enable-active-response --enable-normalizer --enable-reload --enable-react --enable-flexresp3
#     Additional information:
#     This configuration file enables active response, to run snort in
#     test mode -T you are required to supply an interface -i <interface>
#     or test mode will fail to fully validate the configuration and
#     exit with a FATAL error

# This file contains a sample snort configuration. 
# You should take the following steps to create your own custom configuration:
#  1) Set the network variables.
#  2) Configure the decoder
#  3) Configure the base detection engine
#  4) Configure dynamic loaded libraries
#  5) Configure preprocessors
#  6) Configure output plugins
#  7) Customize your rule set
#  8) Customize preprocessor 
... keep reading on reddit ➡

👍︎ 2
📰︎ r/snort
📅︎ Apr 23 2019
Httpwebrequest: saved cookies from login, passed it to next request, still no response in Fiddler

Apologies about the last post. I've been working on this for days and am starting to go crazy.

I am trying to log in to a website in C#. I have to do it with HttpWebRequest and HttpWebResponse because I eventually need to override a method in some software called Abot that passes my login cookies to its HttpWebRequest. Then I'll be able to crawl through the site, and eventually scrape it.

It's been days, and I am really struggling with how to do this. It turns out every single website needs to be custom-tailored to log in this way. I've manually logged in with Fiddler to observe the traffic and have tried my best to recreate what was done manually in this program. I am able to almost replicate the two main POSTs that get done, but I don't receive the GET response I need, and I have no idea why.

From what I've gathered, I need to get login cookies from a login attempt and then pass those cookies to every subsequent request from then on. I have done my best to do exactly that:

using System;
using System.Collections.Generic;
using System.IO;
using System.Linq;
using System.Net;
using System.Text;
using System.Threading.Tasks;
namespace super_login_attempt
{
    class Program
    {
        static void Main(string[] args)
        {
           HttpWebRequest requestWithLoginCookies = (HttpWebRequest)WebRequest.Create("https://loginurl");
           CookieCollection loginCookies = new CookieCollection();
           requestWithLoginCookies.CookieContainer = new CookieContainer();
           requestWithLoginCookies.Method = "POST";

           //set request headers
           requestWithLoginCookies.Accept = "application/json, text/javascript, */*; q=0.01";
           requestWithLoginCookies.Headers.Add("Origin", "https://baseurl");
           requestWithLoginCookies.Headers.Add("X-Requested-With", "XMLHttpRequest");
           requestWithLoginCookies.Headers.Add("X-SC-Touchpoint", "checkout");
           requestWithLoginCookies.UserAgent = "Mozilla/5.0 (Windows NT 10.0; Win64; x64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/67.0.3396.99 Safari/537.36";
           requestWithLoginCookies.ContentType = "application/json; charset=UTF-8";
           requestWithLoginCookies.Referer = "https://differentloginurl";
           requestWithLoginCookies.Headers.Set(HttpRequestHeader.AcceptEncoding, "gzip, deflate, br");
... keep reading on reddit ➡

👍︎ 2
📰︎ r/csharp
👤︎ u/OliveSweatshirt
📅︎ Jul 28 2018
Application Load Balancer uses non-standard "Expires" header, classic ELB uses standard "max-age" for session stickiness. Why?

I have a client application that uses Apache HTTP Client v4.5.2, and cannot modify its code. It needs to use session stickiness to the backend, and we've switched from using the classic ELB to the new application ELB in order to use the WAF.

Apache HTTP Client supports cookies and works with the ELB, which sends an AWSELB cookie:


However, when we use the ALB, it uses an AWSALB cookie:

AWSALB=QclQ[...]V2kP; Expires=Tue, 27 Dec 2016 09:31:43 GMT; Path=/

As the client app uses Apache HTTP Client, its code would need to be changed to support the non-standard Expires header, which is really the issue.

Can anyone think of a good reason why AWSALB is working differently to AWSELB, or potential workarounds that don't involve changing the code?

EDIT: Correction - as of RFC 6265 both are valid. That'll help the argument the change should be made in the client, but whether or not it will be - and rolled out quickly enough - means a plan B would be helpful :-)
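For what it's worth, RFC 6265 also defines the precedence when both attributes appear; a sketch of the client-side computation in Python (the attribute-dict shape is my own):

```python
from datetime import datetime, timedelta, timezone
from email.utils import parsedate_to_datetime

def cookie_expiry(attrs, now=None):
    """Compute a cookie's expiry per RFC 6265: Max-Age, when present,
    takes precedence over Expires; neither means a session cookie."""
    if "Max-Age" in attrs:
        now = now or datetime.now(timezone.utc)
        return now + timedelta(seconds=int(attrs["Max-Age"]))
    if "Expires" in attrs:
        return parsedate_to_datetime(attrs["Expires"])
    return None

exp = cookie_expiry({"Expires": "Tue, 27 Dec 2016 09:31:43 GMT"})
print(exp.isoformat())  # 2016-12-27T09:31:43+00:00
```

So a compliant cookie jar already has to handle an Expires-only cookie like AWSALB's; the gap here is in the client library's jar, not the header itself.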

👍︎ 16
📰︎ r/aws
👤︎ u/Jaffa2
📅︎ Jan 20 2017
How I compiled a list of all free plans: revisited.

Hello everybody,

if someone cares to remember, around 2 years ago I posted a one-liner to allow automated downloading of plans from without extra hassle.

What a brilliant time it was! All of them were available for free, and they only wanted your email in return.

Today I returned to the site to see if something new had turned up, and found that it had, indeed, and not of the pleasant variety: now the plans from seasons 1 to 6 are only available for purchase, but 7 to 9 are up for grabs (for now). Where there used to be 150+ files, there's but half a hundred now. Also, the old one-liner didn't work. Well, not completely: the downloaded list is useless.

Apparently, they decided to somewhat tighten their defenses and ask for your email really persistently. But in vain! The whole cookie-tossing they now have really boils down to asking your email in a web form. I made a new version of my one-liner (as the last time, it's for *NIX OSes and I guess generic enough). Here it is:

wget -qO- '' | \
	grep -o -e '/download/[0-9]\+/.\+\.pdf' | sort -u -t '/' -k 4 | \
	sed 's/^/http:\/\/' | \
	xargs -n 1 -I %u wget "%u" --header "Cookie: wsshop-email=\"[email protected]\""

As per RFC 2606, I used the domain; you may replace it with whatever you fancy, including your actual email address.

Alternatively, if you remove the last part of the line, it'll just churn out a list of files like this (in fact, so does the old one, but this one is better at removing duplicate entries), so you can use it to form a list of URLs, which you can then feed to wget like so:

wget -i YourListFile.Here --header "Cookie: wsshop-email=\"[email protected]\""

I have no idea how to set cookies in other download managers, but it should be doable, I guess.

Finally, by putting a number (7, 8 or 9) in front of [0-9] in the one-liner above, you can select files only from that particular season.

Have fun!

👍︎ 20
📰︎ r/woodworking
👤︎ u/h-v-smacker
📅︎ Jul 20 2016
Should I make a write up of my experiences writing a cookie/session auth system with Go and martini?

Last night I finally finished my auth.go file after many nights of struggling. Throughout the journey I learned a lot about Go, Martini, injection, HTTP, cookies, and way, way more.

Basically I wondered if anyone thinks it would be valuable to describe my journey and walk through the code?

Some details:

  • Used postgres backend (and actually test against it, using my own init scripts)
  • Heavily used TDD for end-to-end auth (and enjoyed it!!)
  • Auth works with request params and with cookies (I read the OAuth2 RFC and loosely based it around that)
  • Used SHA2 + Salt for password hashing
  • Used injection for managers and dbcontrollers
  • ~700 lines from find . -name *.go | wc -l

There's more details but have to run!

EDIT: unrefined source at

👍︎ 14
📰︎ r/golang
👤︎ u/BookPage
📅︎ Feb 11 2014
PreSeason 5.24 Tristana Guide

I was starting to answer /u/Torem_Kamina's comment here when it blew out of proportion.

Right now, I think there is no longer such a thing as a core/unique item path/build. Adaptation is the master word.

18/12/0 masteries. Most of the time you will go like this; however, if your (exotic) support doesn't have any really reliable CC, or you are really confident you will get kills, you can swap the Oppressor with the other mastery. If you are confident in your last hitting you can take the out-of-combat MS. I made tests; Natural Talent is better on Tristana, as on most other ADCs, and so it is for Vampirism too. The amount you will leech from your E (and passive) really adds up. After the last patch I was really worried that Fervor of Battle would be a lot worse; however, between the 1s longer duration, your bigger-than-average range, spells, and the E passive that applies two stacks, it is still the flat-out best option, be it for the laning phase or later stages of the game. You are one of the ADCs stacking the mastery, and you have the easiest time keeping it stacked.

Regarding runes, I think AS quints are mandatory. AD reds are great, Armor yellows; I run 6 flat MR blues and 3 AS blues on my standard page to be more comfortable if I have to LH under turret. I am curious to know if anyone is doing something different. Previously I ran a couple of MP5 runes; I don't feel the urge now.

I'm running exhaust/flash most of the time. Between the new support masteries and my style I prefer the support to take ignite/heal.

As an ADC you know that there are 3 items you will want to complete at some point in the game:

  • A zeal upgrade
  • IE
  • Boots

All the rest is optional.

Then regarding the build order it depends.

  • Basically, if you have to do an early back, just take a second Doran.
  • Don't wait for 1300 gold to back if you feel like there is the slightest risk. Just get your Pickaxe + Cookies.
  • However, once you have the Pickaxe, I'd advise doing BF before any Zeal parts.
  • Then do Zeal, and complete your Zeal item. If you have to choose between a crit glove or a dagger, take the dagger.
  • Try to avoid buying Kircheis Shard, I find the item terrible, I'd rat
... keep reading on reddit ➡

👍︎ 10
📰︎ r/MarksmanMains
👤︎ u/Sobou_
📅︎ Dec 10 2015
How scripts can survive on reddit in the OAuth2 era

OAuth2 is a mechanism for granting a third-party application your privileges as a user of a web service without giving it your username and password. The privileges are divided into small pieces (scopes) on the web service's side, and a third party can request from the user only as much privilege as its application needs.

On reddit, OAuth2 became a MUST for API clients last February, and cookie authentication was deprecated. A grace period for migrating to OAuth2 was provided, but it ends on March 17 this year:

So users of scripts (in this article, "script" means a personal script you write yourself and that only you use) are struggling to move to OAuth2. For scripts, there are mainly two approaches:

  • OAuth2 Quick Start Example - the resource owner password credentials grant (hereafter, the password grant)
  • PRAW and OAuth - the authorization code grant (hereafter, the code grant). The code grant can be obtained without standing up a server, so you can ignore the Flask server example at the end of that page if you don't need it

The code grant is the more desirable approach security-wise, but the password grant is convenient and hard to dismiss. In the end, you have to decide for yourself based on what your script does and the environment it runs in. As material for that decision, here is the current behavior of reddit's authorization server:

  • A code grant is valid for 10 minutes
  • An access token is valid for 1 hour
  • With the password grant, no refresh token is issued
  • A refresh token is not consumed by use (once obtained, it can be used forever)

PRAW users should also consider using the oauth_* settings in praw.ini.
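To illustrate the password grant, here is a sketch that builds (but does not send) the token request against reddit's token endpoint, using only the Python standard library (the credentials below are placeholders):

```python
import base64
import urllib.parse
import urllib.request

def build_token_request(client_id, client_secret, username, password, user_agent):
    """Build a resource-owner-password-credentials token request
    (RFC 6749, section 4.3) for reddit's authorization server."""
    body = urllib.parse.urlencode({
        "grant_type": "password",
        "username": username,
        "password": password,
    }).encode()
    req = urllib.request.Request("https://www.reddit.com/api/v1/access_token", data=body)
    # Client authentication is HTTP Basic with client_id:client_secret.
    creds = base64.b64encode(f"{client_id}:{client_secret}".encode()).decode()
    req.add_header("Authorization", f"Basic {creds}")
    req.add_header("User-Agent", user_agent)
    return req  # pass to urllib.request.urlopen(req) to actually send it

req = build_token_request("my_id", "my_secret", "my_user", "my_pass", "my-script/0.1")
print(req.get_method())  # POST
```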


  • - reddit's official documentation on OAuth2. Recommended reading after getting an overview of OAuth2 from, e.g., Chapter 1 of RFC 6749
  • RFC 6749, 6750, 6819 - the OAuth2 specification, bearer tokens, and OAuth2 security considerations
  • r2/controllers/, r2/models/ - readable code. For the validators, my earlier article may be of some help
  • - the list of reddit's scopes
👍︎ 9
📰︎ r/p18s
👤︎ u/nmtake
📅︎ Jan 27 2016
A bug in Anaconda 3.4? Or maybe a bug in Python 3.4?

As I mentioned in , I have met a strange problem.

I debugged into the code and found something interesting.

The request gets the header right, but when it parses the header, it goes wrong.

If a header field contains a space next to the ':', it causes a problem. I mean, if the header contains something like:
Cache-Control : No-cache (a space to the left of the ':' and a space to the right), it will go wrong.

In file anaconda3.4/lib/python3.5/email/, line 227:

    if not headerRE.match(line):
        # If we saw the RFC defined header/body separator
        # (i.e. newline), just throw it away. Otherwise the line is
        # part of the body so push it back.
        if not NLCRE.match(line):
            defect = errors.MissingHeaderBodySeparatorDefect()
            self.policy.handle_defect(self._cur, defect)
            self._input.unreadline(line)
        break

and the definition of headerRE:

    headerRE = re.compile(r'^(From |[\041-\071\073-\176]*:|[\t ])')

So, when a header contains a space next to the ':', it goes wrong.

When I modified the file, changing it to headerRE = re.compile(r'^(From |[\041-\071\073-\176]*\s*:\s*|[\t ])'), everything works fine.

I think it's a bug in Python, because Firefox can read the cookie correctly from this kind of format, and the definition of HTTP headers never says that there should be no space next to the ':'.
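The behavior is easy to reproduce in isolation; a minimal demonstration of the stock pattern (regex copied from the post):

```python
import re

# The stock pattern from the email package's parser: a run of printable
# characters (excluding space and ':') that must be followed immediately
# by ':', or a continuation line starting with tab/space.
headerRE = re.compile(r'^(From |[\041-\071\073-\176]*:|[\t ])')

print(bool(headerRE.match("Cache-Control: No-cache")))   # True
print(bool(headerRE.match("Cache-Control : No-cache")))  # False
```

The space before the ':' ends the character-class run, so the mandatory ':' is never found and the line is rejected as a header.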

btw:sorry for my poor English.

👍︎ 5
📰︎ r/learnpython
👤︎ u/limw
📅︎ Jan 17 2017
PHP Strict Sessions

There is a current vulnerability in PHP < 5.5.2, but reading through the RFC and other public bug reports (I don't have access to the PHP one), there seems to be a userland workaround of using a validation key in the session. Can someone explain this vulnerability and the workaround? I am not seeing how the vulnerability can be fixed by the workaround, since having the session ID in the cookie will start the correct session (so they have access to the validation key). Am I missing some simple setting or something else?
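As I understand it (my reading, not established fact), the workaround targets session adoption/fixation (an attacker-chosen ID that the server never initialized), rather than hijacking of an existing ID, which is why the validation key still helps. A sketch of the userland check, shown in Python for neutrality (all names are mine):

```python
def reject_uninitialized_session(session, regenerate_id):
    """Userland emulation of strict sessions: if the session carries no
    server-set marker, the ID was supplied from outside (possible
    fixation), so discard it and start fresh."""
    if session.get("__valid") is not True:
        regenerate_id()            # issue a brand-new session ID
        session.clear()
        session["__valid"] = True  # marker only the server can set

calls = []
fresh = {}
reject_uninitialized_session(fresh, lambda: calls.append("regenerated"))
print(calls, fresh)  # ['regenerated'] {'__valid': True}
```

An attacker who plants an ID of their choosing never gets the marker set, so the server regenerates the ID before storing anything sensitive; it does nothing against an attacker who steals an already-valid ID.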


👍︎ 7
📰︎ r/PHP
👤︎ u/oracle1124
📅︎ Feb 19 2014
My take on Rapid Firecannon Twisted Fate

Since the idea of getting RFC on TF is getting more and more popular, I've tried it out and succeeded with my way of playing it. The item's good on TF because it allows him to have a 675-range, point-and-click, 2-second stun. I turned the idea into a very strong hybrid build based on the current popular way of playing TF, revolving around rushing CDR boots with Homeguard.
For runes, i run:
-Dual Penetration marks (i consider them standard on all AP carries but Vlad and Karthus
-Scaling HP seals
-Scaling CDR glyphs
-AP quints (MS are good too but recently I'm liking AP because i will have a lot of speed from items anyways)

For masteries, I run 18/12, tho 12/18 would be just as good.
What's important about those masteries is :
-Meditation for insane mana sustain to allow constant roaming
-Natural Talent for hybrid scaling
-Savagery to improve early pushing
-Feast for lane sustain
-Secret stash for MORE LANE SUSTAIN WOO

For summoners, Flash TP for maximum map presence

Skill order is standard, Q>W>E, but I like to alter the earlygame route. I often take 2 points in W early and like to get E at level 2 (sometimes even WEWQ, though usually WEQQ or WQWE).

I usually start with a Doran's Ring and 2 cookies.
My earlygame goal is to push the lane as much as I can. Open with hitting the first caster minion wave with a red card, then push the lane fast with basic attacks (maybe sneak in a blue card somewhere) and back off as soon as the wave is dead.
Rinse and repeat till level 6.
Once I hit level 6, I instantly TP to a lane. (I start planning around level 5 and tell my team to be ready; the moment I hit 6, I'm instantly TPing.)

Earlygame core is 2x Doran's Ring (coupled with Meditation it means infinite mana), Ionian Boots with Homeguard (Distortion once Homeguard gets removed) for roams, and Sheen.
This will yield 25% CDR (30% with utility masteries) at level 6, good damage and sustain.
(or, what's more realistic, around 27.5% CDR at level 9)

The next item is the infamous Firecannon. It works in a similar fashion to Lich Bane: it grants bonus burst damage and movement speed.
But it also sets the ground for an AD build and grants some early autoattack damage, and, what's most important, allows me to stun people from 675 units, which is ridiculous.
The next item purchase is Trinity Force. This gives a total of 13%(!) movement speed and 50% crit.
The Tforce passive essentially makes my

... keep reading on reddit ➡

👍︎ 6
📰︎ r/summonerschool
👤︎ u/CentaurHecarim
📅︎ Nov 14 2015
Browser fingerprint

Basic overview- Introduction

lets start from browser fingerprint which is exactly how you get declines! so what is browser finger print ? anything from browser can help store / shop / payment processor to detect you are same person who tried to card them.

for understanding what they exactly can use against you simply head to

you will see different fields here .

user agent : which will exactly tell them your browser / OS / and exact version of them HTTP_ACCEPT Headers : the headers your browser accept just read HTTP RFC to understand more Browser Plugin Details : plugins loaded on your browser which can be your player / reader / anything can be loaded inside browser Time Zone : your current system time zone Screen Size and Color Depth : very interesting one will detect your screen size from size of your browser and that's why you should never change your tor window size System Fonts : fonts you've installed they detect it by flash plugin Are Cookies Enabled : does your browser accept cookie ? cookies can be very unique identification and you could've detected with this very easy Limited supercookie test : far more than cookies DOM and local storage in simple word more permanent cookies

Now head to and click on the extended version; again you will see the same things, with some more fields like DNS, IP range, Java, JavaScript, WebRTC, Flash, ActiveX, and again screen, language, and time zone.

As you may not know, all of these can be used as anti-fraud measurements. But how? Every plugin like Flash and Java/JavaScript gives the site more programmatic control over the client, which means stores can use these tools to detect your real location/IP/DNS, easily flag you as a carder, and decline the order.

So can we just disable these plugins and be done? No! If you disable everything, that is again very suspicious, because you don't look like a normal customer. So what is the solution? The solution is to properly spoof your identification to look like the real card owner. We always hear silly claims about "cardable" versus "non-cardable" stores. No such thing exists, and we are surprised how many people here spread wrong information without any knowledge. If a website is an online store that accepts credit cards, it can be carded; if you can't, it only means you didn't spoof things correctly. Even one mistake across all of the anti-fraud measurements can lead to an order decline and


👍︎ 2
📰︎ r/WarandPeace
👤︎ u/Baccilar
📅︎ Dec 11 2015
Feature request: second factor or one-time password support

Obligatory note: I think I have seen this idea proposed before, but a quick search did not turn up anything.

The idea is simple: allow reddit users to optionally activate a one-time password device for their account. Then, detect when they are logging in from a new machine (either by setting a machine-level cookie or by some fingerprinting), and require a one-time password the first time they log in from that machine.

Google started doing this recently with their 2-step verification. The really nifty thing is that they open-sourced the Android and BlackBerry mobile clients, and also made the iOS version available for free (iTunes link).

The Google clients are based on the open HOTP/RFC-4226 and draft TOTP standards. This means that the Google clients are already available and ready to be used, but any other implementation of the standards could also be used. Redditors don't necessarily need to own a smart phone or use Google's software to take advantage.
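For reference, the HOTP algorithm from RFC 4226 that these clients implement is short enough to sketch. A minimal Python version is below (TOTP is the same computation with the counter replaced by floor(unix_time / 30)):

```python
import hashlib
import hmac
import struct

def hotp(secret: bytes, counter: int, digits: int = 6) -> str:
    """HMAC-based one-time password per RFC 4226."""
    # The moving factor is an 8-byte big-endian counter.
    msg = struct.pack(">Q", counter)
    digest = hmac.new(secret, msg, hashlib.sha1).digest()
    # Dynamic truncation: the low 4 bits of the last byte pick an
    # offset, and 31 bits starting there become the code.
    offset = digest[-1] & 0x0F
    code = struct.unpack(">I", digest[offset:offset + 4])[0] & 0x7FFFFFFF
    return str(code % 10 ** digits).zfill(digits)

# RFC 4226 Appendix D test vectors, ASCII secret "12345678901234567890":
print(hotp(b"12345678901234567890", 0))  # 755224
print(hotp(b"12345678901234567890", 1))  # 287082
```

Any RFC-compliant client (Google's included) will produce the same codes from the same shared secret, which is what makes the open standard so convenient here.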

The downsides are that it will take time to implement, and it is somewhat painful to generate a one-time password. By setting a cookie at the machine level, though, the one-time password could only be required, say, for new computers, or every 6-12 weeks for a known computer.

Edit: fixed links, enhanced wording.

👍︎ 2
👤︎ u/sgndave
📅︎ Dec 19 2010
Snort startup fail

Hello there,

I built Snort 3 from source and it does not seem to work.

$snort -V

,,_     -*> Snort++ <*-
o"  )~   Version 3.0.0-a4 (Build 228) from 2.9.8-383
''''    By Martin Roesch & The Snort Team
       Copyright (C) 2014-2016 Cisco and/or its affiliates. All rights reserved.
       Copyright (C) 1998-2013 Sourcefire, Inc., et al.
       Using DAQ version 2.2.1
       Using libpcap version 1.8.1
       Using LuaJIT version 2.0.3
       Using PCRE version 8.35 2014-04-04
       Using ZLIB version 1.2.8
       Using OpenSSL 1.0.1t  3 May 2016

But if I try to start it this happens:

$snort -c /etc/snort/snort1.conf
 o")~   Snort++ 3.0.0-a4-228
 Loading /etc/snort/snort1.conf:
 FATAL: can't load /etc/snort/snort1.conf: /etc/snort/snort1.conf:1: '=' expected near 'HOME_NET'
 Fatal Error, Quitting..

this is my config file

ipvar HOME_NET,
ipvar EXTERNAL_NET any
portvar HTTP_PORTS [80,81,311,383,591,593,901,1220,1414,1741,1830,2301,2381,2809,3037,3128,3702,4343,4848,5250,6988,7000,7001,7144,7145,7510,7777,7779,8000,8008,8014,8028,8080,8085,8088,8090,8118,8123,8180,8181,8243,8280,8300,8800,8888,8899,9000,9060,9080,9090,9091,9443,9999,11371,34443,34444,41080,50002,55555]
portvar ORACLE_PORTS 1024:
portvar SSH_PORTS 22
portvar FTP_PORTS [21,2100,3535]
portvar SIP_PORTS [5060,5061,5600]
portvar FILE_DATA_PORTS [$HTTP_PORTS,110,143]
portvar GTP_PORTS [2123,2152,3386]
ipvar AIM_SERVERS [,,,,,,,,,,,]
var RULE_PATH /etc/snort/rules
var SO_RULE_PATH /etc/snort/rules/so_rules
var PREPROC_RULE_PATH /etc/snort/rules/preproc_rules
var WHITE_LIST_PATH /etc/snort/rules/iplists
var BLACK_LIST_PATH /etc/snort/rules/iplists
config disable_decode_alerts
config disable_tcpopt_experimental_alerts
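The error message itself is a hint: Snort 3 parses its configuration as a Lua script, so Snort 2.x directives like `ipvar` and `portvar` fail with Lua syntax errors ("'=' expected"). A minimal Lua-style sketch of the same variables follows (paths and values are assumptions, not a drop-in config):

```lua
-- snort.lua sketch: Snort 3 configs are plain Lua assignments,
-- not the 2.x ipvar/portvar directives (values below are assumed)
HOME_NET = 'any'
EXTERNAL_NET = 'any'
HTTP_PORTS = '80 8080 8888'
RULE_PATH = '/etc/snort/rules'

ips =
{
    include = RULE_PATH .. '/local.rules',
}
```

Starting from the snort.lua shipped with the Snort 3 source tree, rather than converting a 2.x snort.conf line by line, is probably the easier route.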

👍︎ 2
📰︎ r/snort
👤︎ u/Kr4ut
📅︎ Mar 29 2017
