[OC] Animated Heatmap of Parler Video GPS Metadata in DC on January 6th 2021 v.redd.it/qkkqj6nl0ya61
πŸ‘︎ 7k
πŸ“°︎ r/dataisbeautiful
πŸ’¬︎
πŸ‘€︎ u/_Xeet_
πŸ“…︎ Jan 12 2021
🚨︎ report
NOSSAFLEX app has been officially released on the App Store! Meter, log and export your film photo metadata quickly and efficiently! Read more about our features, privacy and future development in the comments section! reddit.com/gallery/lgvk05
πŸ‘︎ 653
πŸ“°︎ r/AnalogCommunity
πŸ’¬︎
πŸ“…︎ Feb 10 2021
🚨︎ report
I made a site that automatically edits the metadata of anime openings and downloads them

animeta.co

I got sick and tired of constantly adding album art and the artist name to the openings I downloaded, so I decided to make a site that does it for me. It gets images from DuckDuckGo image search, so it might not always pick the perfect image, but it seems to be alright for popular songs.

I hope it's helpful to at least one person other than me. Have a nice day!

If you have any feedback, please let me know in the comments.
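
For anyone curious what the tagging step looks like under the hood, here is a minimal sketch of embedding an artist name, title and cover art into an MP3 with Python's mutagen library (my own illustration of the general approach, not the site's actual code; the file names and tag values are made up):

    from mutagen.mp3 import MP3
    from mutagen.id3 import ID3, APIC, TPE1, TIT2

    audio = MP3("opening.mp3", ID3=ID3)
    if audio.tags is None:          # the file may not have an ID3 tag yet
        audio.add_tags()
    audio.tags.add(TPE1(encoding=3, text="LiSA"))      # artist, UTF-8
    audio.tags.add(TIT2(encoding=3, text="Gurenge"))   # track title
    with open("cover.jpg", "rb") as art:
        audio.tags.add(APIC(encoding=3, mime="image/jpeg",
                            type=3,                    # 3 = front cover
                            desc="Cover", data=art.read()))
    audio.save()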

πŸ‘︎ 651
πŸ“°︎ r/animepiracy
πŸ’¬︎
πŸ‘€︎ u/sodiumkid
πŸ“…︎ Feb 13 2021
🚨︎ report
LPT: If you have a DSLR camera, take note of the serial number in case it is stolen. Many cameras include the serial number in the metadata of the photos they take. Tools exist to trace any photos taken with your camera once they are posted online.

One such service is CameraTrace. Under their FAQ is a partial list of supported camera models. Currently they support tracing on about 5 popular photo hosting sites, including Twitter.

Other similar services probably exist or are likely to appear in the future, and the range of sites they can search is likely to expand.
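
If you want to check whether your own camera writes its serial number, here is a quick sketch using Python and Pillow (assuming the camera uses the standard EXIF BodySerialNumber tag; many makers store the serial in proprietary MakerNote fields instead, which a tool like exiftool can usually read):

    from PIL import Image

    EXIF_IFD = 0x8769            # pointer to the EXIF sub-directory
    BODY_SERIAL_NUMBER = 0xA431  # standard tag from the EXIF 2.3 spec

    img = Image.open("photo.jpg")
    exif = img.getexif().get_ifd(EXIF_IFD)
    print(exif.get(BODY_SERIAL_NUMBER, "no serial number tag found"))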

πŸ‘︎ 29k
πŸ“°︎ r/LifeProTips
πŸ’¬︎
πŸ‘€︎ u/free-dadjokes
πŸ“…︎ Dec 15 2020
🚨︎ report
I love that, despite almost everything else on the internet that depends on old software being broken, iTunes will still grab CD metadata from CDDB and rip away like it's still the early 2000s.
πŸ‘︎ 286
πŸ“°︎ r/ipod
πŸ’¬︎
πŸ‘€︎ u/cosmogfd
πŸ“…︎ Feb 19 2021
🚨︎ report
Microsoft unifies all Windows APIs under a single Rust library generated from metadata kennykerr.ca/2021/01/21/r…
πŸ‘︎ 430
πŸ“°︎ r/programming
πŸ’¬︎
πŸ‘€︎ u/Karma_Policer
πŸ“…︎ Jan 21 2021
🚨︎ report
Metadata and timing suggest that Benfica renewed with several youth players in January but only announces each renewal after a bad result reddit.com/gallery/lnjkk6
πŸ‘︎ 117
πŸ“°︎ r/benfica
πŸ’¬︎
πŸ‘€︎ u/kokeboka
πŸ“…︎ Feb 19 2021
🚨︎ report
Rust for Windows Bindings: Generating the Entire Windows API Surface from Metadata kennykerr.ca/2021/01/21/r…
πŸ‘︎ 444
πŸ“°︎ r/rust
πŸ’¬︎
πŸ‘€︎ u/itchyankles
πŸ“…︎ Jan 21 2021
🚨︎ report
Parler Video GPS metadata timelapse from the storming of the US Capitol twitter.com/savaki/status…
πŸ‘︎ 767
πŸ“°︎ r/dataisbeautiful
πŸ’¬︎
πŸ‘€︎ u/UnderwearNinja
πŸ“…︎ Jan 14 2021
🚨︎ report
why did no one hide their metadata? location? face?

It seems like they did everything possible to get caught during this coup. Was it just incompetence, or what?

πŸ‘︎ 30
πŸ“°︎ r/ParlerWatch
πŸ’¬︎
πŸ‘€︎ u/justreddit247
πŸ“…︎ Feb 13 2021
🚨︎ report
Hi guys, here's my program to standardize video files (MP4, MKV, M4V) by removing unwanted metadata, setting default tracks and more. (Windows/Linux)

Hi, after 2 months of development I'd like to show off my program, StandardFormatTranscoder:

https://github.com/jacksalssome/StandardFormatTranscoder

It was originally created to make all my anime default to English subtitles and Japanese audio, as selecting them manually every time was a pain. Now it can also handle TV shows and films.

So now all your "ISOs" can have the same defaults and standard track names.

It's a command-line program with the following arguments:
[--overwrite]
[-i INPUT]
[-o OUTPUT]
[--engAudioNoSubs] Removes all subtitles if it finds English audio and no Japanese audio
[--DryRun] Preview changes

Features:

  • Basic file renaming [--rename]; use FileBot if you want something better lol

  • Recursively transcode files [-r]

  • Uses FFmpeg as its backend

  • Keeps codecs: give it an MKV with MP3 audio and HDMV PGS subs and it will output the same

  • Helpful error messages

  • Color

  • Track renaming, so tracks get named, for example: "English (2.0)", "Japanese (7.1)", "Commentary", "Full Subtitles", "English (2.0) (CC)"

  • Can remove all audio that isn't English or Japanese if an English audio track is found

  • Tests file integrity (uses FFmpeg's built-in check)

  • Removes attachments, MPEG video and cover art

  • Commentary detection

  • Tries to find the best sub if there are 2 or more English subtitle tracks

  • Songs/signs detection; won't make them default if found

  • English subs with "dialogue", "full subs", "full subtitle", "[full]", "(full)" or "modified" in the title will become the new default

  • And more

On Windows I usually use:

StandardFormatTranscoder.exe -r --rename --engAudioNoSubs -i "D:\Test" -o "D:\Output Test"

For a preview of changes:

StandardFormatTranscoder.exe -r --rename --engAudioNoSubs --DryRun -i "D:\Test" -o "D:\Output Test"
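
For reference, the default-track flipping that the tool automates maps onto FFmpeg's disposition flags. A rough hand-rolled equivalent for a single file might look like the sketch below (my illustration of the underlying mechanism, not what StandardFormatTranscoder literally runs; the stream indexes are made up):

    import subprocess

    # Copy every stream untouched, clear the default flag on the first audio
    # track, and mark the second audio track (e.g. Japanese) and the first
    # subtitle track as default instead.
    subprocess.run([
        "ffmpeg", "-i", "input.mkv",
        "-map", "0", "-c", "copy",
        "-disposition:a:0", "0",
        "-disposition:a:1", "default",
        "-disposition:s:0", "default",
        "output.mkv",
    ], check=True)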

Please read the GitHub page for install instructions and more info :)

πŸ‘︎ 40
πŸ“°︎ r/DataHoarder
πŸ’¬︎
πŸ‘€︎ u/jacksalssome
πŸ“…︎ Feb 21 2021
🚨︎ report
[OC] ytmdl - Download music with metadata from various sources. Now supports downloading split chapters. v.redd.it/c7q3afduo8b61
πŸ‘︎ 990
πŸ“°︎ r/unixporn
πŸ’¬︎
πŸ‘€︎ u/Droider412
πŸ“…︎ Jan 14 2021
🚨︎ report
ytmdl - Download songs from YouTube with metadata. Now supports deezer, lastfm, saavn. v.redd.it/gn187rw3r5461
πŸ‘︎ 3k
πŸ“°︎ r/Piracy
πŸ’¬︎
πŸ‘€︎ u/Droider412
πŸ“…︎ Dec 09 2020
🚨︎ report
Any ideas why Plex isn't picking up the metadata for episodes 2-7 here? The files are named the same way as all the others, and all the info is there on TheTVDB. Not sure what to try
πŸ‘︎ 45
πŸ“°︎ r/PleX
πŸ’¬︎
πŸ‘€︎ u/kurtanglesmilk
πŸ“…︎ Feb 19 2021
🚨︎ report
Archivists Are Mining Parler Metadata to Pinpoint Crimes at the Capitol vice.com/en/article/qjpev…
πŸ‘︎ 911
πŸ“°︎ r/politics
πŸ’¬︎
πŸ‘€︎ u/djorankeil
πŸ“…︎ Jan 13 2021
🚨︎ report
Cell phone metadata and geolocation will doom the coup participants.

People brought their cellphones as they committed crimes. The metadata will be recovered, linked to a geolocation inside the Capitol, and linked to the owners of said phones. This should facilitate the work of the FBI.

https://www.washingtonpost.com/technology/2021/01/08/trump-mob-tech-arrests/?itid=hp-top-table-main-0106

πŸ‘︎ 608
πŸ“°︎ r/ParlerWatch
πŸ’¬︎
πŸ‘€︎ u/alubat_ovni
πŸ“…︎ Jan 08 2021
🚨︎ report
Pleasure to see that Signal attempts to remove metadata from images, and it refuses to share the image if it fails.
πŸ‘︎ 268
πŸ“°︎ r/signal
πŸ’¬︎
πŸ‘€︎ u/Bobby-Bobson
πŸ“…︎ Jan 14 2021
🚨︎ report
To anyone who got their music deleted by Distrokid - here's how to get your ISRC&UPC Codes and other metadata

Multiple people, including me, have recently been hit by a deletion of releases by Distrokid without any explanation; I'm not going to comment on that any further.

If you want to republish your music, you have to keep as much of the metadata as possible identical to the deleted release, including the ISRC and UPC codes.

While the releases might have been deleted, you can actually find backups, including metadata, in the "vault" - it's found in the "more" drop-down section at the top right.

Hope this helps you get your music back online as soon as possible.

EDIT: Always back up your metadata and take screenshots of your uploads (stuff like genre/subgenre, start times, etc.).

πŸ‘︎ 13
πŸ’¬︎
πŸ‘€︎ u/spockadoodle
πŸ“…︎ Feb 18 2021
🚨︎ report
Why does ProtonVPN collect so much personal data and metadata compared to other VPNs?

ProtonVPN is considered among the best VPNs in terms of privacy and security according to various websites (link1, link2). Moreover, ProtonVPN is open source and audited. However, according to Apple's app privacy labels, it collects a lot of personal data:

ProtonVPN: personal data (contact info, identifiers, user content), metadata (contact info, diagnostics, usage data)

Mullvad: no data and metadata

IVPN: no data and metadata

expressVPN: personal data (contact info, identifiers), metadata (diagnostics, usage data)

nordVPN: personal data (identifiers), metadata (contact info, identifiers, diagnostics, usage data)

Can you clarify this?

Thank you

P.S.

ProtonVPN is among the worst according to this comparison.

πŸ‘︎ 63
πŸ“°︎ r/ProtonVPN
πŸ’¬︎
πŸ‘€︎ u/pachainti
πŸ“…︎ Jan 26 2021
🚨︎ report
What's your opinion on metadata, and on :test, :pre and :post?

From time to time, I love reading The Joy of Clojure; it is really inspiring and teaches a lot about the Clojure philosophy for getting stuff done. I still find neat tricks I had forgotten since my last read, and I'd love to one day see an appendix on post-1.6 Clojure features like transducers and clojure.spec.

In the book, the authors lean heavily on the Clojure metadata facility. You can attach, to most Clojure values, some metadata that does not change the value's identity but gives it some satellite intel. This has obvious utility for designing Clojure dev tools: for example, every var has metadata attached to it, like its source file, line and column, scope, arglist, etc.

However, I've almost never noticed a codebase exploiting metadata for domain logic. Do you sometimes use meta and with-meta in your projects? For what purpose?

Another question is about the :pre, :post and :test metadata that you can attach to a function. This brings some basic contract-based programming and testing to every Clojure function. I used to see some :pre and :post a few years ago, but now that spec is out, do you still use them, or do you rely more on spec's contract facilities?

πŸ‘︎ 16
πŸ“°︎ r/Clojure
πŸ’¬︎
πŸ‘€︎ u/charlesHD
πŸ“…︎ Feb 17 2021
🚨︎ report
ytmdl - Download songs with metadata embedded from various sources. Now supports downloading chaptered songs from YouTube. v.redd.it/xcn1i0mlp8b61
πŸ‘︎ 335
πŸ“°︎ r/linux
πŸ’¬︎
πŸ‘€︎ u/Droider412
πŸ“…︎ Jan 14 2021
🚨︎ report
Parler Users Video GPS Data Shows Involvement in Capitol Attack β€” "More than 99% of all Parler posts...were saved. Unlike most of its competitors, Parler apparently had no mechanism in place to strip sensitive metadata from its users’ videos prior to posting them online." gizmodo.com/parler-users-…
πŸ‘︎ 163
πŸ“°︎ r/ConspiracyII
πŸ’¬︎
πŸ‘€︎ u/pijinglish
πŸ“…︎ Jan 12 2021
🚨︎ report
Scraped Parler data is a metadata goldmine techcrunch.com/2021/01/11…
πŸ‘︎ 214
πŸ“°︎ r/technology
πŸ’¬︎
πŸ‘€︎ u/javaxcore
πŸ“…︎ Jan 11 2021
🚨︎ report
Mass Removal/Editing of File Metadata

I've spent a fair bit of time over the past week trying to fix my content's metadata (not the Plex metadata, the actual files' metadata) so that it doesn't screw with Plex. This is for one specific reason - in order to have external subtitles properly recognised in Plex, you have to have the Local Media Assets agent not only on, but at the top of the list. At least, you do according to Plex, and from my brief and non-exhaustive testing, this seems to be the case. As I'm sure many of you know, this means that you're likely to end up with movies being displayed as The.Great.Escape.1963.RARBG.x264.IHateYourOrganisationHaHa. The obvious solution is to remove the metadata that causes this from the files. You could also edit it to actually be correct, but if that's your bag, you probably already have the infrastructure in place to do that, and this post isn't intended for you.

Anyway, I decided to put everything I've learned in one place, as there doesn't seem to be anything like this anywhere else on the internet, and this seemed as good a place as any.

The good news is, much of this can be done from within Windows itself (I imagine there's a similar method for Linux or OSX, but I don't have a Linux machine at the moment, and will likely never have a Mac). Simply go to your relevant content folder, search for *.<file extension> (e.g. *.mp4), and wait for the search to complete. Display everything in information view (in Windows 10, and I believe in every version from 7 onwards, there's a handy button at the bottom right to do that), right click on the headers, and choose "More..." Scroll through the list, and add whichever tag you're looking for - most likely Title. Then sort the list by that header, select everything with an entry, and open the properties. Go to Details, and click "Remove Properties and Personal Information". Here's where it gets fun - some filetypes allow you to do this, and some don't. You'll be able to tell the difference by the list that appears - if it has checkboxes, you can, if it doesn't, you can't. If it has them, tick the ones you want to clear, then hit OK. It may take a while, depending on how many files you're working with, but it'll handle it. You can also tell it to create a copy of each file with all possible properties removed, but I imagine that's not an option for most of us - I certainly don't have enough free space to duplicate my entire library.
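
An alternative to clicking through Explorer is to script the cleanup; FFmpeg can rewrite a container without re-encoding while dropping the global tags, and it handles most of the filetypes Explorer refuses to touch. A minimal sketch (assuming ffmpeg is on the PATH; the folder and output naming are just examples):

    import pathlib, subprocess

    for src in pathlib.Path(r"D:\Movies").rglob("*.mp4"):
        dst = src.with_name(src.stem + ".clean.mp4")
        # -map 0 keeps every stream, -c copy avoids re-encoding,
        # -map_metadata -1 drops the global metadata (title, comment, etc.)
        subprocess.run([
            "ffmpeg", "-i", str(src),
            "-map", "0", "-c", "copy", "-map_metadata", "-1",
            str(dst),
        ], check=True)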

For the other filetypes, there are other options. I'

... keep reading on reddit ➑

πŸ‘︎ 11
πŸ“°︎ r/PleX
πŸ’¬︎
πŸ‘€︎ u/EOverM
πŸ“…︎ Feb 12 2021
🚨︎ report
Noticed an odd request in my logs; anyone know what metadata.google.internal is?
πŸ‘︎ 53
πŸ“°︎ r/pihole
πŸ’¬︎
πŸ‘€︎ u/goose_ws
πŸ“…︎ Feb 19 2021
🚨︎ report
Need a solution to index more than 30 million files including metadata and file content on our Windows File Server of around 30 TB

Here is what I have tried and reasons why they are not working.

  1. Windows Search Indexer server role
     * Limited to 1 million files: https://docs.microsoft.com/en-us/troubleshoot/windows-client/shell-experience/windows-search-performance-issues#:~:text=The%20Indexer%20can%20index%20up,Outlook%20mailboxes%20on%20the%20computer
     * Maybe there is a way around this limit??

  2. X1 Rapid
     * Requires manually creating hundreds of "mount points" that break all the files up into UNC paths of under 200,000 files each, which is WAY too much work to maintain.
     * Too expensive - over $30k per year to maintain the license.

  3. Lokeen
     * Support never responded when running the trial, which kept failing and never indexed even close to all of our data.

  4. DtSearch
     * Requires manually creating hundreds of indexes and does not allow for one central index for all clients to use.

  5. Copernic
     * Currently on trial attempting to build an index, but it is running WAY too slow - only indexing 1,000 files per hour.

Note: before telling me I need to upgrade the hardware on the server, note that I am using all enterprise flash storage on my file servers, I am storing the indexes on Intel Optane enterprise flash drives, my network speed is 80 Gbps between servers, and all servers have loads of CPU and memory to work with. I have monitored with Task Manager and Resource Monitor, and these servers are nowhere close to being bottlenecked.

πŸ‘︎ 4
πŸ“°︎ r/sysadmin
πŸ’¬︎
πŸ‘€︎ u/erasnick
πŸ“…︎ Feb 19 2021
🚨︎ report
Feature Request: Add an β€œalbum subheading” option in Apple Music to display metadata such as deluxe, expanded, remastered, anniversary editions, etc.
πŸ‘︎ 349
πŸ“°︎ r/iOSBeta
πŸ’¬︎
πŸ‘€︎ u/sickpanda42
πŸ“…︎ Jan 20 2021
🚨︎ report
TIL Photos downloaded on a Mac have the download URL in their Metadata photoinvestigator.co/blog…
πŸ‘︎ 40
πŸ“°︎ r/osx
πŸ’¬︎
πŸ‘€︎ u/bmxice
πŸ“…︎ Feb 10 2021
🚨︎ report
How can I specify a list of keywords in MP3 metadata?

I'd like to specify a comma-separated list of keywords; ideally they would be indexed and searchable later, using some kind of software. So for example, in this "keywords" field, I would enter "bird, colaptes-auratus, piciformes, picidae, male, winter, michigan, dan-gibson"

Is that even possible? I need an audio format that can be played in a web page.

Thanks for any ideas.
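
For reference, ID3 has no dedicated keywords frame, but a common workaround is a user-defined text frame (TXXX). A sketch using Python's mutagen library (the frame description "Keywords" is just a convention; whether a given player or indexer picks it up is another matter):

    from mutagen.id3 import ID3, TXXX

    tags = ID3("song.mp3")   # assumes the file already carries an ID3 tag
    tags.add(TXXX(encoding=3, desc="Keywords",
                  text="bird, colaptes-auratus, piciformes, picidae, "
                       "male, winter, michigan, dan-gibson"))
    tags.save()

Most tag editors will at least display such a frame even if they don't index it.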

πŸ‘︎ 10
πŸ“°︎ r/musichoarder
πŸ’¬︎
πŸ‘€︎ u/Asker82237
πŸ“…︎ Feb 21 2021
🚨︎ report
[IOS] [β€ŽEXIF Viewer: Photo Metadata] [$0.99->Free] apps.apple.com/app/id1499…
πŸ‘︎ 58
πŸ“°︎ r/AppHookup
πŸ’¬︎
πŸ‘€︎ u/robert91818
πŸ“…︎ Jan 27 2021
🚨︎ report
Can I control which types of metadata are automatically downloaded? I ONLY want the art stuff, I don't want any of the other clutter like ratings, genres, descriptions etc.

Hello,

I like letting Jellyfin automatically download art for my media. That's cool. But I DON'T want it to automatically download and display anything else.

  • I don't want it to show episode descriptions, because they often contain spoilers
  • I don't want it to show critic or fan ratings, because I don't want my opinion of media preemptively influenced by others' opinions
  • I don't want to see the parental rating or the time of day/week the show originally aired because they are just clutter that I don't care about
  • I don't want to see the "Cast & Crew" section because again it's just clutter and if I want to know that information it's much more effective to look it up on Wikipedia or IMDb
  • I don't want it to show genres, because I find they often take away some surprise from the media

Basically I want Jellyfin to be a fancy yet minimalist file browser and video file player, but one that automatically downloads and displays cover art from my series. None of the other metadata stuff. Is there a way to make it do this?

I'd honestly be happy to code the feature myself and make a PR if necessary, so if someone familiar with the codebase could point me to where the metadata downloads happen that would be cool!

πŸ‘︎ 20
πŸ“°︎ r/jellyfin
πŸ’¬︎
πŸ‘€︎ u/ShutUpAboutGenres
πŸ“…︎ Feb 13 2021
🚨︎ report
How to get more experience working with metadata

Hi everyone,

I currently work in Access Services at an academic library. I had a previous job working in archives (a 3-month contract position) where I got to work with metadata and digital collections and really enjoyed it. I eventually want to move from Access to working more with digital repositories, but I feel like I need more hands-on experience. Does anyone know if there's any kind of volunteer work out there for working with metadata, or what kinds of organizations I could join to get more involved with managing digital repositories? I would really just appreciate any advice on how to get more involved with metadata work. Right now, my job isn't flexible enough for me to work on it at my current institution.

Thanks!

πŸ‘︎ 41
πŸ“°︎ r/librarians
πŸ’¬︎
πŸ‘€︎ u/strikhedonian
πŸ“…︎ Feb 12 2021
🚨︎ report
Cuss words in metadata?

I'm creating metadata for a photoset of political protests that occurred in the city where I work a few years ago. The photos have no descriptions, so I'm creating descriptions just in case the files get mixed up or corrupted.

Several of the photos feature protestors carrying signs that say "F*ck I.C.E." or "F*ck White Supremacy" (bleeped out because idk the rules on this subreddit).

My question is: what are the best practices for including what is considered "offensive" language into descriptions? I'd like to include it, as it's literally what's on the signs, but didn't know if that was typically done or not. Any advice would be much appreciated.

πŸ‘︎ 28
πŸ“°︎ r/Archivists
πŸ’¬︎
πŸ‘€︎ u/florallibrarian
πŸ“…︎ Feb 19 2021
🚨︎ report
Native asset transaction fees are to be paid in ADA only! There are currently no plans to allow fees to be paid in native asset tokens. This will create high demand for ADA! According to the IOG metadata workshop - LIVE NOW youtube.com/watch?v=LrN3E…
πŸ‘︎ 65
πŸ“°︎ r/cardano
πŸ’¬︎
πŸ‘€︎ u/tradefeedz
πŸ“…︎ Jan 18 2021
🚨︎ report
All geotagged metadata from the Parler dump as a .csv file with timestamps and video durations gofile.io/d/PUxeV4
πŸ‘︎ 182
πŸ“°︎ r/datasets
πŸ’¬︎
πŸ‘€︎ u/acanthias13
πŸ“…︎ Jan 13 2021
🚨︎ report
Stop posting pictures of your stack. There is metadata embedded in photos that includes the time and GPS location.
πŸ‘︎ 22
πŸ“°︎ r/Wallstreetsilver
πŸ’¬︎
πŸ‘€︎ u/Quippykisset
πŸ“…︎ Feb 05 2021
🚨︎ report
Will "zfs send" preserve all file metadata and permissions?

I'm gonna move different datasets to another pool on my TrueNAS 12 U2 box, and I have the following two questions:

  1. If I use "zfs send" will it preserve all metadata and user permissions? And any suggestions to flags that I should add to the send command?
  2. Should I make a snapshot and then disable all services to prevent writes to the pool? Or should I take the pool offline and then "send" to ensure that I copy the most recent version of the pool?

Any suggestions or experiences with this are highly welcome.
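
For concreteness, this is roughly the kind of thing I'm planning to run (my understanding is that zfs send -R sends the snapshot stream with dataset properties, snapshots and descendants, and that send/receive works on the snapshot's blocks, so file ownership and permissions should come along; a sketch assuming a recursive snapshot named @migrate):

    import subprocess

    # take a recursive snapshot of the dataset to be moved
    subprocess.run(["zfs", "snapshot", "-r", "tank/data@migrate"], check=True)

    # replicate it to the new pool; -R includes properties and child datasets,
    # -u receives without mounting so the copy can be inspected first
    send = subprocess.Popen(["zfs", "send", "-R", "tank/data@migrate"],
                            stdout=subprocess.PIPE)
    subprocess.run(["zfs", "receive", "-u", "newpool/data"],
                   stdin=send.stdout, check=True)
    send.stdout.close()
    send.wait()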

πŸ‘︎ 8
πŸ“°︎ r/freenas
πŸ’¬︎
πŸ‘€︎ u/runevee
πŸ“…︎ Feb 20 2021
🚨︎ report
Deleted Scenes all have the erotic poster and metadata of a 2010 gay movie of the same name.

I don't know how to fix this other than turning off scanning of my extras folder. I have my collection set up with movies in one folder and extras in another. There are way too many now to integrate them.

Was planning on sharing my collection with my family but I can only imagine what my mother would say.

πŸ‘︎ 6
πŸ“°︎ r/emby
πŸ’¬︎
πŸ“…︎ Feb 20 2021
🚨︎ report
Setting up custom email campaigns is hard: This company used oddly specific user metadata in the name field of their email campaign.
πŸ‘︎ 149
πŸ’¬︎
πŸ‘€︎ u/goffstock
πŸ“…︎ Feb 02 2021
🚨︎ report
Metadata Question

Hello, I currently have 50 to 60 short films downloaded from Vimeo and YouTube. Is there any way to add metadata to a particular short film from its IMDb page?
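
For reference, once the details have been copied from a film's IMDb page, FFmpeg can write them into the container without re-encoding; dedicated media managers can automate the lookup side. A sketch (the tag names and values here are just examples):

    import subprocess

    subprocess.run([
        "ffmpeg", "-i", "short_film.mp4",
        "-c", "copy",
        "-metadata", "title=La Jetee",
        "-metadata", "date=1962",
        "-metadata", "comment=Details copied from the film's IMDb page",
        "short_film.tagged.mp4",
    ], check=True)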

πŸ‘︎ 8
πŸ“°︎ r/datacurator
πŸ’¬︎
πŸ‘€︎ u/john_doe_57
πŸ“…︎ Feb 16 2021
🚨︎ report
Just finished programming a manga metadata tagger, so us hoarders finally have a comprehensive solution for building our digital manga collection!

New release and installation instructions available here now!

(I posted this in /r/selfhosted as well)

After about three arduous weeks of programming and testing, I have finally finished working on Manga Tagger. For those of you who collect digital comics, you may be familiar with Comic Tagger. Well, that’s what inspired this project of mine. It does the same thing as Comic Tagger, but tailored to Japanese manga, working directly with Free Manga Downloader.

For those of you who don't know about FMD - you should definitely be using it to grab your manga. I've seen a few posts in search about methods of downloading manga, but I haven't seen FMD mentioned. It is the best tool I've come across in searching everything available, and it works extremely well (outside of not grabbing metadata).

Onto Manga Tagger: I programmed it to be completely automated; it watches the FMD download directory and automatically processes any new files that are downloaded. Manga Tagger renames them to a comic-esque format (My Hero Academia 001.cbz) and applies metadata in the form of a ComicInfo.xml. I used the ComicRack way instead of ComicBookLover as it's the more supported format.
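
To give a feel for the "ComicRack way": a .cbz is just a zip archive, and tagging it amounts to dropping a ComicInfo.xml into it. A tiny sketch (the element names follow the ComicRack schema, the values are made up, and Manga Tagger's real output is far richer):

    import zipfile
    from xml.sax.saxutils import escape

    def write_comicinfo(cbz_path, series, number, title):
        xml = (
            "<?xml version='1.0' encoding='utf-8'?>\n"
            "<ComicInfo>\n"
            f"  <Series>{escape(series)}</Series>\n"
            f"  <Number>{escape(number)}</Number>\n"
            f"  <Title>{escape(title)}</Title>\n"
            "</ComicInfo>\n"
        )
        # append the metadata file to the existing archive
        with zipfile.ZipFile(cbz_path, "a") as cbz:
            cbz.writestr("ComicInfo.xml", xml)

    write_comicinfo("My Hero Academia 001.cbz", "My Hero Academia", "001", "Izuku Midoriya: Origin")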

Manga Tagger grabs metadata from Anilist and MyAnimeList collectively (to get all relevant information.) MyAnimeList doesn’t have a native API so I make use of the Jikan library, and although it doesn’t natively implement any rate limiting, I programmed this functionality into Manga Tagger so that it’s not being abused. To also help on rate limiting, I make use of MongoDB to store any information so that it can easily be reused without having to be scraped again, and also so that it’s easily malleable (instead of dropping to a flat file or a static comicinfo.xml)

Manga Tagger is multi-threaded (or as multi-threaded as Python can be) and allows for multiple files to be processed at a time. It also has a fairly comprehensive logging solution, with support for console, files, JSON, TCP and JSON TCP. It keeps track of everything that was done, and has just enough information in the standard logs that if an error were to be encountered (it happens; we’re dealing with the internet after all) it can not only be easily traced, but easily reproduced and should allow for fairly fast fixes.

I currently use DataDog for log monitoring and it’s the best thing since sliced bread, I swear. The way logging is handled has really hel

... keep reading on reddit ➑

πŸ‘︎ 676
πŸ“°︎ r/DataHoarder
πŸ’¬︎
πŸ‘€︎ u/iVtechboyinpa
πŸ“…︎ Dec 08 2020
🚨︎ report
Special vdev questions - will it offload existing metadata, and can I remove it if the vdev is not mirrored?
  1. "zpool remove" should support removing special vdevs from the pool, but based on what I read I am unsure whether this is possible when the special device is NOT mirrored - I mean, for example, a single-point-of-failure zpool made out of 2 HDDs, where one is for data and the other is the special device.
  2. Does anyone know how effectively the special vdev can offload the HDD's work (mainly IOPS/tps-wise) when it is added after the zpool has already been filled with data? That is, will ZFS try to migrate/copy/move the metadata of already existing files to the special vdev at all, and can this substantially (percentage estimate?) reduce the IOPS/tps on the main zpool drive? Thank you
πŸ‘︎ 6
πŸ“°︎ r/zfs
πŸ’¬︎
πŸ‘€︎ u/postcd
πŸ“…︎ Feb 14 2021
🚨︎ report
Nearly 14,000 hacked Parler videos include personal computer usernames in the video metadata.
πŸ‘︎ 302
πŸ’¬︎
πŸ‘€︎ u/lvlsuxdik
πŸ“…︎ Jan 17 2021
🚨︎ report
Looks like Discord fixed the location metadata issue.
πŸ‘︎ 5k
πŸ“°︎ r/discordapp
πŸ’¬︎
πŸ‘€︎ u/Arowinal
πŸ“…︎ Nov 25 2020
🚨︎ report
[FEATURE] iOS 14.5 Beta 1: Music metadata scrolling in media controls is back

This was removed with the media controls redesign in 14.2. Long metadata (song title, album and artist) now scrolls again.

πŸ‘︎ 69
πŸ“°︎ r/iOSBeta
πŸ’¬︎
πŸ‘€︎ u/freaktheclown
πŸ“…︎ Feb 01 2021
🚨︎ report
With my Mini 2, if you select English subtitles (VLC player) on playback, you get metadata on screen. This might already be well known, but I just discovered this nifty feature. Photo of a church in Centralia, Pennsylvania.
πŸ‘︎ 154
πŸ“°︎ r/dji
πŸ’¬︎
πŸ‘€︎ u/jepensedoucjsuis
πŸ“…︎ Jan 12 2021
🚨︎ report
I built a JSON'ified collection of Star Trek TNG, VOY, and DS9 episode transcripts with speaker, location, and some episode metadata, including a JSON schema github.com/jkingsman/Star…
πŸ‘︎ 147
πŸ“°︎ r/datasets
πŸ’¬︎
πŸ‘€︎ u/CharlesStross
πŸ“…︎ Jan 26 2021
🚨︎ report
Can anyone recommend a good app for bulk metadata editing?

As above. I've tried Windows properties and VLC, and Plex still seems to find things like the notes left in the files and add them back after I've removed them.

πŸ‘︎ 4
πŸ“°︎ r/PleX
πŸ’¬︎
πŸ‘€︎ u/DemisGiamalis
πŸ“…︎ Feb 13 2021
🚨︎ report
ILPT: If you steal a DSLR camera, change/disable the metadata that gets added to the photos. Many cameras include the serial number in the metadata of the photos they take. Tools exist to trace any photos taken with your camera once they are posted online.
πŸ‘︎ 6k
πŸ’¬︎
πŸ‘€︎ u/JohnDoe_2408
πŸ“…︎ Dec 15 2020
🚨︎ report
