.: Poor :.
Released 19 years ago, Nov 2002, by Bernhard Möhr
- Coded by: Bernhard Möhr
- Version: Poor
- Release date: Nov 2002, 19 years ago.
- Family: Poor
- Category: Remote Access
- Server size: 10,240 bytes
Dear Ladies and Gentlemen,

I'm a software developer and have started writing programs with Microsoft .NET technology. .NET offers the possibility to create assemblies (i.e. programs) on the fly, without the need for a compiler. I think this is quite a dangerous feature. For example, you can write a (useful) program and integrate a very small piece of code as a backdoor.

My proof-of-concept program runs in the background, and if the user visits a prepared website, the code for the virus is extracted from the HTML page, an assembly is created, and it is executed. The background program contains no malicious commands; it needs only a runtime version of MS .NET to be started. The whole code for the virus is located in the website as comments. On bigger websites, and with some effort, the code could be hidden better and nobody would find it.

My proof of concept waits for the creation of a *.htm file on drive C: (which happens while surfing the internet, due to IE's caching) and extracts the virus code from there (the page --/URL REDACTED BY SUB7CREW.ORG FOR YOUR SAFETY\-- contains the virus). The virus lists all files on drive C: and deletes the file C:\ProofVirus.txt.

Creating a virus is simple: write the virus in C#, compile it, disassemble it with the .NET disassembler, and you can write your own code into the HTML page (please note that not all commands are supported by my executable, but it is only a matter of minutes to add more). The application is only a proof of concept, but it could easily be developed into something more sophisticated. It works on the PCs I tested; I hope it also works on yours for demonstration.

I think there are two possible points where security could be increased:
- The on-the-fly creation of assemblies in .NET is very dangerous -> protect it.
- Common documents like HTML, DOC, BMP, JPG, ... should be scanned by the virus scanner, and not only for malicious commands; I think all content not understood by the (usual) application, or disabled by the user, should be filtered out. In other words, the content should run through a filter that, for example, only allows valid HTML to pass through (I'm thinking of a technology like WebWasher, but more sophisticated).

Yours, Bernhard Möhr
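The "on the fly" assembly creation the letter warns about is a documented, legitimate .NET feature: the `System.Reflection.Emit` API lets a program build and run an assembly at runtime with no compiler involved. A minimal benign sketch, using the classic .NET Framework `AppDomain` entry point of that era (the assembly, type, and method names here are illustrative, not from the original proof of concept):

```csharp
using System;
using System.Reflection;
using System.Reflection.Emit;

class EmitDemo
{
    static void Main()
    {
        // Define an in-memory assembly, module, and type at runtime.
        var asmName = new AssemblyName("DemoAssembly");
        AssemblyBuilder asmBuilder = AppDomain.CurrentDomain
            .DefineDynamicAssembly(asmName, AssemblyBuilderAccess.Run);
        ModuleBuilder modBuilder = asmBuilder.DefineDynamicModule("DemoModule");
        TypeBuilder typeBuilder = modBuilder.DefineType(
            "DemoType", TypeAttributes.Public);

        // Define a static int Add(int, int) method and emit its IL by hand --
        // no compiler is invoked at any point.
        MethodBuilder method = typeBuilder.DefineMethod(
            "Add", MethodAttributes.Public | MethodAttributes.Static,
            typeof(int), new[] { typeof(int), typeof(int) });
        ILGenerator il = method.GetILGenerator();
        il.Emit(OpCodes.Ldarg_0);
        il.Emit(OpCodes.Ldarg_1);
        il.Emit(OpCodes.Add);
        il.Emit(OpCodes.Ret);

        // Bake the type and invoke the freshly emitted code.
        Type type = typeBuilder.CreateType();
        object result = type.GetMethod("Add")
            .Invoke(null, new object[] { 2, 3 });
        Console.WriteLine(result); // prints 5
    }
}
```

This is exactly why the letter's first recommendation targets this API: any process hosting the runtime can synthesize and execute new code from arbitrary input, so the input itself never needs to contain an executable file.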
URLs and email addresses were automatically redacted (filtered) for readers' safety. However, the filter is not perfect and can't catch all harmful elements. If you find something dangerous, including a file link, website, email address, profanity, etc., contact me immediately at firstname.lastname@example.org. Thank you in advance.